Federal Grant Approval at a State Education Agency: An Evaluation Study
by
Robert Martin Thompson
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
December 2020
Copyright 2020 Robert Martin Thompson
DEDICATION
I dedicate this dissertation to my father, Dr. Dennis Thompson. Over the course of my
life, you have been a warm reminder to stay the course, keep pushing, and soar with the
eagles. From helping me with school projects to supporting every dream from astronaut to educator,
you have shown me the value of scholarship every step of the way. Thank you for always taking
the time to show me what it means to be of stout heart. This process has brought us closer than I
thought possible and I am eternally grateful for this opportunity. I hope this degree will help me
to stand tall for those who cannot and this next step will influence even more students who did
not have the same privileges that I have had along the way.
ACKNOWLEDGEMENTS
Dr. Min, thank you for your careful, considerate and timely feedback along the way.
Your passion for the work is infectious and you lead by humble example so future Trojans can
find their own way to become change agents. Dr. Canny, thank you for your guidance. You are
a gifted educator and scholar and I am so grateful to have had the privilege to work with you. To
my committee members, I am not sure how I ended up with such a phenomenal group of people,
but I am certainly thankful that I did. Thank you for caring. Thank you for digging in and doing
the work that needs to be done. Your thoughtful and intentional considerations not only helped
strengthen the project, but encouraged me to consider even further implications. I am eternally
grateful for your contributions, Dr. Krop and Dr. Kellar.
To my family, for always valuing education and believing in me. To my mother, Bonnie
Thompson, for always allowing me to be curious and encouraging every opportunity possible to
help me become a lifelong learner. To my stepmother, Jackie Thompson, and father, Dr. Dennis
Thompson, your support has helped me to become the man I am today. To my wife, Andrea:
you always knew I could do it and believed in me when I did not believe in myself. This
opportunity is possible because of all of your combined efforts to shape me as a scholar and to
contribute to the greater good of my community.
This goes out to my USC Tribe, you know who you are. Your unconditional support
taught me how to fight on even when the challenge seemed insurmountable.
Finally, to my office and leadership, especially Dr. Nazie Mohajeri-Nelson. Thank you
for believing in this work and for helping to bring it to fruition.
A big thank you to the school districts who participated to help better the lives and
academic outcomes for their students.
TABLE OF CONTENTS
Dedication
Acknowledgments
List of Tables
List of Figures
Abstract
Chapter One: Introduction
Introduction of the Problem of Practice
Organizational Context and Mission
Organizational Goal
Importance of the Evaluation
Description of Stakeholder Groups
Stakeholder Group for the Study
Purpose of the Project and Questions
Methodological Framework
Definitions
Organization of the Project
Chapter Two: Review of the Literature
Federal Grant Applications Approval from a State Education Agency
Federal Funding for Public Education
Federal Funding Initiatives under the Every Student Succeeds Act
Planning and Implementation of Federally Funded Activities
Role of Stakeholder Group of Focus
Clark and Estes’ (2008) Gap Analysis
Conceptual Framework
Summary
Chapter Three: Methods
Participating Stakeholders
Quantitative Data Collection and Instrumentation
Qualitative Data Collection and Instrumentation
Credibility and Trustworthiness
Validity and Reliability
Ethics
Limitations and Delimitations
Chapter Four: Results and Findings
Participating Stakeholders
Knowledge Results
Motivation Results
Organizational Results
Synthesis
Chapter Five: Solutions and Recommendations
Recommendation for Practice to Address KMO Influences
Integrated Implementation and Evaluation Plan
Strengths and Weaknesses of the Approach
Limitations and Delimitations
Recommendations for Future Research
Conclusion
References
Appendix A
Appendix B
Appendix C
Appendix D
LIST OF TABLES
Table 1. State Four-Year Graduation Rate by Subgroup
Table 2. Organizational mission, global goal, and stakeholder goals
Table 3. Knowledge Influences, Knowledge Types, and Knowledge Assessment
Table 4. Motivational Influences and Assessments for Analysis
Table 5. Organizational Influences, Organizational Types, and Organization Assessment
Table 6. Quantitative Surveys: Response Rate
Table 7. Quantitative Survey: Years of Service
Table 8. Qualitative Surveys: Response Rate
Table 9. Qualitative Surveys: Years of Service
Table 10. Assumed Knowledge Influences, Determination, and Summary of Findings
Table 11. Question 2: Frequency and Percentage of Responses
Table 12. Efforts to Receive Final Approval Leads to Equitable Outcomes
Table 13. Assumed Motivation Influences, Determination, and Summary of Findings
Table 14. My [LEA] Success on the Application is Dependent on my Knowledge of Federal Funds
Table 15. My [LEA] Success on the Consolidated Application is Dependent upon the SEA Reviewer
Table 16. Assumed Organizational Influences, Determination, and Summary of Findings
Table 17. Q14 Compared to Q15 Response Rates
Table 18. Average Hours Spent on Resubmission of the Consolidated Application
Table 19. I Believe Network Meetings Help to Submit an Application
Table 20. Assumed KMO Influences, Determination, and Summary of Findings
Table 21. Assumed KMO Influences, Determination, and Summary of Findings
Table 22. Summary of Knowledge Influences and Recommendations
Table 23. Summary of Motivation Influences and Recommendations
Table 24. Summary of Organizational Influences and Recommendations
Table 25. Outcomes, Metrics, and Methods for External and Internal Outcomes
Table 26. Critical Behaviors, Metrics, Methods, and Timing for Evaluation
Table 27. Required Drivers to Support Critical Behaviors
Table 28. Evaluation of the Components of Learning for the Program
Table 29. Components to Measure Reactions to the Program
LIST OF FIGURES
Figure 1. Interactive Conceptual Framework
Figure 2. I am Familiar with the Consolidated Application Manual
Figure 3. I Use the Consolidated Application Frequently to Inform Decisions
Figure 4. Q6: I Feel that My Efforts to Receive Final Approval will Lead to Equitable Outcomes for Students
Figure 5. Item Q7 versus Item Q8
Figure 6. Hours Spent on the Consolidated Application
Figure 7. Sample District Performance Dashboard
Figure 8. Western State Department of Education Federal Funding Context
ABSTRACT
The passage of the Elementary and Secondary Education Act, as reauthorized by the Every
Student Succeeds Act, has afforded opportunities to close achievement gaps for historically
underserved students. The purpose of this study is to examine the need for Local Education
Agencies to receive final approval for K-12 public education federal funding applications. The
project goal is to develop and implement training delivered by the State Education Agency to
help the LEA achieve 100% approval of the Consolidated Application for federal funds.
The research questions used to guide this study are (1) What are the knowledge,
motivation, and organizational needs necessary for the LEA to achieve 100% approval
(compliance) of consolidated applications for federal funding? and (2) What are the recommendations
for practice to increase approval on the consolidated application for federal funds? Data
collection was completed through an online, self-report LEA grant administrator survey and an
open-ended, qualitative survey with SEA grant reviewers. These data were analyzed using the
convergent parallel mixed methods approach. Drawing on a review of the literature, assumed
KMO influences were validated or invalidated using the data sets from both the qualitative and
quantitative components of the study. A comprehensive implementation and evaluation plan
following the New World Kirkpatrick Model (Kirkpatrick & Kirkpatrick, 2016) is outlined to
measure the effectiveness of the recommendations. The recommendations offered in Chapter
Five are designed to address the stakeholders’ knowledge, motivation, and organizational gaps at
the LEA and SEA levels in order to achieve the organizational goal.
Key Terms: Access, Accountability, Attribution, Compliance, Equity, ESEA, ESSA, Evidence-
Based, Federal Funds, NCLB, Self-Efficacy, Organization, Research-Based
CHAPTER ONE: INTRODUCTION
Introduction of the Problem of Practice
This study addresses the problem of errors in federal grant-funded applications submitted by
Local Education Agencies to a State Education Agency within one fiscal year cycle. The
majority of federal funds for K-12 education institutions are awarded through the Every Student
Succeeds Act, which is designed to “provide all children significant opportunity to receive a
fair, equitable, and high-quality education, and to close educational achievement gaps” (ESSA,
2018). Local Education Agencies apply for federal funds through the State Education Agency
annually. In the 2017, 2018, and 2019 fiscal years, the federal programs unit of a western State
Education Agency approved federal grant applications for 70% of the approximately 180 LEAs
within the required 30-day approval period. The
State Education Agency (SEA) and Local Education Agency (LEA) have a shared responsibility
to ensure federal applications for K-12 education initiatives are approved and meet all statutory
compliance indicators within the federally required 30-day approval period consistent with the
application requirements (EDGAR, 2018). Several factors contribute to the lack of approval
within the 30-day approval period, including issues related to knowledge of the consolidated
application for funds at the state and local levels, training and guidance around compliance
issues, and organizational influences that affect timely submission and review of the application
to the state agency (N. [Redacted], personal communication, February 21, 2019). The timely
approval of the applications ensures that federal funds flow to the LEAs such that
the educational programming and supports associated with those federal funds are not disrupted.
According to the Western State Department of Education (WSDE, pseudonym) website,
LEAs may begin to obligate funds on July 1, once all criteria for submission are met within the
30-day approval period. The state defines this period as the substantial approval deadline.
Criteria to establish substantial approval of the federal grant application from the WSDE to the
LEA include the LEA plan narrative, assurances of statutory regulations under the Every
Student Succeeds Act (ESSA), a balanced budget, and forms related to the state’s intake
requirements. The state guidance further specifies that the LEA may not obligate funds for
supplemental federal activities until the completed application has been submitted with all of
the components necessary to receive approval. The WSDE grant application manual establishes
the 30-day timeframe within which the LEA receives feedback and final approval. The WSDE
grant application manual further stipulates that the WSDE may award final approval to the
LEA’s application for federal funds once all requirements are met and the LEA plan has been
approved.
Once final approval is granted, LEAs can submit a request to WSDE for funds to receive
reimbursements for the activities proposed in the application. A delay in the approval of
federal grant applications risks interruptions in reimbursement for school district activities
that may have been obligated by the LEA but not yet approved by the state agency. In that case,
the LEA must provide services with other funding sources. Because of the supplemental nature of
federal funding, the LEA may not have the resources to fully fund the activities proposed in the
federal funding application. If an activity is not approved by the SEA, the LEA risks delays in
hiring personnel, cuts to other programs due to lack of funding, or delays in support for
historically underserved students served through federal funding streams.
The evidence highlights that if 100% of school district applications are not accepted
within the 30-day approval deadline, students in school districts without final approval do not
receive federally funded programs promptly. This problem is important to address
because the inability of a district to obligate federal funds can risk delays in hiring new staff,
programming interruptions, and students not receiving the supports they need for the upcoming
school year. The delay in funding impedes equal access to education and therefore violates
students’ civil right to educational accommodations through federally funded programs under
Title IV of the Civil Rights Act of 1964 (Dennis, 2017; USDE, 2017). Once a school district
receives approval, the district may obligate funds for the activities selected (hiring new staff,
implementing programs, purchasing equipment, etc.) using the approved supplemental federal
funding (EDGAR, 2018; ESSA, 2018). Federal funding is facilitated through the statutory
requirements outlined in the Elementary and Secondary Education Act. Through this legislation,
the state and Local Education Agencies engage in a mutual partnership that is built
for accountability purposes and to improve outcomes for historically underserved students.
Per the federal statute and the state identified ESSA plan, the LEA or school district is
required to submit a plan for addressing the needs of historically underserved students to the
State Education Agency. The accountability embedded in the state report card ensures that local
agencies and state agencies are collaborating to create positive outcomes for students utilizing
federal funding initiatives. The Elementary and Secondary Education Act (ESEA) as
reauthorized by the Every Student Succeeds Act requires states to present their four-year
adjusted cohort high school graduation rates to the United States Department of Education
(USDE). Included in this western state ESSA plan for high school graduation rate is a
disaggregated table containing the following subgroups and their graduation percentage: free and
reduced lunch, English learners, students with disabilities, and students experiencing
homelessness.
Table 1
State Four-Year Graduation Rate by Subgroup
Subgroup                                  Four-Year Graduation Rate    Number of Students
1. Free and Reduced Lunch                 70.7%                        22,123
2. English Learners                       67%                          5,654
3. Students with disabilities             58.6%                        3,883
4. Students experiencing homelessness     55.4%                        1,747
Each subgroup is measured against a total state graduation rate of 78%. All of the subgroups
presented show disproportionately low high school graduation rates for ESSA-identified
subgroups of students in K-12 public education in this western state. ESSA includes new
provisions that can be utilized to increase equity for historically underserved students including
underprivileged or low-income, minority, English learners, special education, and homeless
student populations (Cook-Harvey, Darling-Hammond, Lam, Mercer, & Roc, 2016). According
to the state’s ESSA preliminary allocations, the state is allotted over 182 million dollars in
funding to meet the needs of students in the identified subgroups.
To engage in supplemental activities through federal funding, a school district is required
to submit plans that address the requirements of the law under ESSA as well as descriptions of
activities that the district will implement with its Title I, II, III, and IV allocations (ESSA, 2018).
Drawing from Senge’s (1990) notion of creative tension, this evaluative study helps describe the
reality that currently exists for both the school district and the state agency, as well as create
a vision of the future the organizations seek to forge. This investigation provides a
new lens through which the SEA and district can interact and negotiate to create equitable services for
students. As outlined by ESSA, LEAs that meet compliance can begin to consider more
complex issues such as leveraging supplemental federal funds to go beyond compliance in
providing effective services that enhance learning and close achievement gaps.
Organizational Context and Mission
The department of education is located in the western United States and is referred to as
the Western State Department of Education (WSDE, pseudonym). According to the department
website, WSDE serves 911,000 students, over 175 school districts, 1,900 schools, over 50,000
teachers, and 3,000 administrators. WSDE’s vision is to ensure that students graduate from high
school prepared to become contributing citizens of the state by furthering their education in
college or career readiness. WSDE is a large educational agency that utilizes an outcomes-based
system to help determine how well students are being served and to better direct resources to
support student success. The organization is composed of over sixty departments from data and
performance accountability, community partnerships, innovation, learning services, legislative
relationships, auditing, turnaround schools, and a Vision 2020 “Race to the Top” office. WSDE
annually evaluates school districts and schools based on student performance outcomes and
provides a common framework through which to understand performance and focus
improvement efforts. Under the Every Student Succeeds Act of 2015, the federal programs
office of the state is granted supplemental funding to enhance school improvement initiatives at
the Local Education Agency level.
The agency’s strategic action plan contains a five-year strategic improvement plan that
outlines key initiatives and goals. Some of the initiatives include fostering new high school
pathways to prepare for college or careers, creating access for underserved or historically
marginalized populations, as well as quality instruction in literacy for students in early learning
programs. WSDE outlines actionable items to be addressed within five years, including 66% of high
school graduates attaining a postsecondary credential or degree following high school, literacy
goals for early education students, and more than 20% of historically underserved students
achieving academic expectations.
Organizational Goal
The Western State Department of Education federal programs office serves as a steward
of federal funds offered through the Every Student Succeeds Act (ESSA). The goal of the
funding is to improve education programs and services and to close education achievement gaps
at the state and local level. The purpose of the federal programs office is to interpret program
requirements outlined in ESSA and inform local districts of those requirements to ensure the
purpose of the policy is being met through implementation. The state provides guidance through
the Consolidated Application manual to school districts to help navigate statutory expectations
and compliance measures. The office also ensures that federal funds are being utilized to
implement high impact programs and services that are evidence-based in achieving academic
outcomes for targeted student populations. The targeted populations include low-income
students, English learners, children with disabilities, children and youth in foster care, migratory
children, youth experiencing homelessness, neglected, delinquent, and immigrant students.
The goal of the WSDE federal programs unit as it relates to the problem of practice is for
100% of districts to show final approval on the consolidated application for federal funding
within the 30-day approval cycle by July 31, 2022. In conjunction with this goal and the intent
of ESSA legislation, WSDE seeks to enable LEAs to use federal funding to improve the
academic and linguistic outcomes for historically underserved student populations. According to
the education commissioner of the WSDE, the state agency seeks to provide “equity and
opportunity for every student, every step of the way” (K. [Redacted], personal communication,
October 21, 2019).
The WSDE outlines its goals with a sense of urgency that is both civically and financially
minded for the future of the state. The urgency is emphasized in the arena of job readiness,
stating that by the year 2020, three out of four jobs in the state will require education or training
beyond a high school diploma. According to the National Center for Education Statistics (NCES, 2015),
the average high school dropout costs over $250,000 over his or her lifetime. This figure
incorporates lower tax contributions, reliance on public health services, potential criminal activity, and
welfare. Federally funded educational opportunities are intended to close achievement gaps and
create equitable outcomes for historically underserved students (ESSA, 2018).
Importance of the Evaluation
It is important to evaluate the performance of the WSDE against the agency performance
goal of 100% final approval of federal funding applications by July 31, 2022, for a variety of
reasons. If a school district does not disburse funds within the 27-month period of availability,
it risks falling outside of the deadline and potentially losing the funds or being unable to carry
the funds over into the next fiscal year (EDGAR, 2018). The district cannot
reserve the funds needed to cover the cost of any activity that does not receive final approval.
The federal funding timeline is aligned with the fiscal year. According to the fiscal year
calendar, July 1 marks the start of the new fiscal year. If the application is not approved by July 31, the funds
are not accessible until final approval and must be covered through the state, local, or other
funding sources (EDGAR, 2018). From an organizational perspective, the western state ESSA
Consolidated State Plan outlines performance-based benchmarks for disaggregated groups,
which helps to identify schools that require targeted or comprehensive support.
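To make the timeline described above concrete, the short sketch below works through the key dates: the July 1 start of the fiscal year, the July 31 substantial approval deadline, and the 27-month period of availability. It is a minimal illustration only, assuming the 27 months run from the July 1 obligation start; the function name and the example fiscal year are hypothetical and are not part of WSDE's or EDGAR's actual systems.

```python
from datetime import date

def funding_timeline(fy_start_year: int):
    """Illustrative only: key dates in the federal funding timeline described
    above. Assumes the 27-month period of availability runs from the July 1
    obligation start; names and values here are hypothetical."""
    obligation_start = date(fy_start_year, 7, 1)     # fiscal year begins July 1
    approval_deadline = date(fy_start_year, 7, 31)   # end of the 30-day substantial approval window
    # Treat July of the start year as month 1 of the 27-month window.
    last_month = 7 + 27 - 1                          # absolute month index of the final month
    end_year = fy_start_year + (last_month - 1) // 12
    end_month = (last_month - 1) % 12 + 1            # lands on September, two years later
    availability_end = date(end_year, end_month, 30)
    return obligation_start, approval_deadline, availability_end

start, deadline, avail_end = funding_timeline(2021)
print(start, deadline, avail_end)  # 2021-07-01 2021-07-31 2023-09-30
```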
It is also important to evaluate approval rates of consolidated applications to the state
agency because if the school district does not receive approval, the schools identified for targeted
support do not receive the necessary interventions from supplementary federal funding.
According to the state plan outlined on the agency’s website, schools with low
graduation rates and low achievement are identified for targeted or comprehensive support
programs. Two of the frameworks referenced in the Organizational Context and Mission section
include District Performance Frameworks (DPF) and School Performance Frameworks (SPF).
The frameworks include academic achievement, academic growth, and postsecondary workforce
readiness. Using performance data to track how districts and schools are aligning to its
expectations, the state agency assigns performance plans, improvement plans, or turnaround
plans. When a district does not receive final approval for the consolidated application for federal
funding, the school district is not able to encumber funds for support initiatives as outlined by
ESSA and the state performance framework.
If leveraging funds to create equitable outcomes for historically underserved students is
the intent of federally funded initiatives under ESSA, then the approval rate of applications by
the SEA is the starting point to achieve this goal. When a district receives final approval of its
consolidated application for federal funding, the district can access funding for
improvement initiatives including supplies, staffing, instructional coaches, intervention teachers,
and new programming.
Description of Stakeholder Groups
There are two stakeholder groups involved in the study that contribute to the
accomplishment of the organizational performance goal: the WSDE federal programs
unit and the LEA responsible for submitting the consolidated application for federal funding to
the WSDE. The United States Department of Education distributes formula grant funding to
State Educational Agencies (SEAs) to provide supplemental instructional supports and
professional development opportunities. The funding is provided through the Elementary and
Secondary Education Act (1965) as reauthorized by ESSA (2015) and is divided into separate
funding streams or Titles. The WSDE is responsible for receiving funding applications for Title
programs and granting approval to districts to draw down and disburse funds for supplemental
programming activities. Once the district receives approval of an application, the district is
authorized to initiate the programming outlined in the consolidated application for federal
funding.
Stakeholder Group for the Study
While each stakeholder group involved in federal funding compliance measures is
integral to the success of accepting, delivering, and disbursing federal funding to Local
Education Agencies, the goal of 100% of districts showing final approval on the consolidated
application for federal funding within the 30-day approval cycle by July 31, 2022, is driven by
understanding and communication of the objectives in the consolidated application for federal
funding to the school district. The two stakeholder groups of focus for
this study include the federal program grant review team at the WSDE and the district grant
administrators responsible for submitting the consolidated application for federal funding to the
WSDE.
The grant review team is composed of twenty reviewers from the WSDE. Five members
of the team are Title program consultants who review statutory and compliance indicators related
to their respective Title program. Three grant fiscal representatives are responsible for
budget item analysis and coding, four supervisors oversee operations, and eight additional
support staff are included in the process.
The individuals responsible for planning, drafting, submitting, and implementing the
LEA plan are primarily the district grant administrators. To provide educational opportunities
and professional development to students, districts must submit an application providing
assurances, a needs assessment, and a plan of action for delivering the funds for compliance
purposes. To receive approval, each school district’s application (178 districts in total) must meet 100%
compliance.
Both the SEA and LEA are integral partners in the submission and review of federal
grant activities and compliance. Both groups have been selected because the reviewer and
submitter of the application both work closely with the application to ensure that it is compliant
and approved for funding. Grant administrators were selected from approximately 180 LEAs in
the western state and grant reviewers were selected for open-ended surveys from the WSDE.
Table 2 displays the organizational mission, global goal, and stakeholder goals for both the SEA
and LEA collectively as the goal relates to both organizations.
Table 2
Organizational Mission, Global Goal, and Stakeholder Goals
Organizational Mission
The Western State Department of Education Office of Elementary and Secondary Education
aims to evaluate how districts are using federal funding to improve the academic and linguistic
outcomes for historically underserved students.
Organizational/Stakeholder Performance Goal
By July 31, 2022, Local Educational Agencies will show 100% final approval on the
Consolidated Application for federal funding within the 30-day approval cycle from the
Western State Department of Education.
Purpose of the Project and Questions
The purpose of this dissertation is to conduct an evaluative study to analyze the need for
Local Education Agencies to receive final approval for K-12 public education federal funding
applications. The project goal is to develop and implement training delivered by the State
Education Agency to help the LEA achieve 100% approval of the Consolidated Application for
federal funds. This evaluative goal aligns with the organizational mission of using federal
funding to improve academic and linguistic outcomes for historically underserved students. The
project employs the Clark and Estes (2008) gap analysis framework, focusing on the knowledge,
motivation, and organizational factors essential to achieving the organizational performance
goal. As such, the questions that guide this study are the following:
1. What are the knowledge, motivation, and organizational needs necessary for the LEA to
achieve 100% approval (compliance) of consolidated applications for federal funding?
2. What are the recommendations for practice to increase approval on the consolidated
application for federal funds?
Methodological Framework
The examination of approval rate using the Knowledge, Motivation, and Organizational
(KMO) gap analysis framework will require both quantitative and qualitative data collection
implemented using the convergent parallel mixed methods model (Clark & Estes, 2008; Creswell
& Creswell, 2018). This mixed-methods approach will examine the KMO needs of district grant
administrators and grant reviewers at the state agency to comply with the Consolidated
Application requirements for funding.
According to Creswell and Creswell (2018), the convergent parallel mixed methods
procedure is appropriate for this research design because of the availability of data from the
primary stakeholder, the school district, and the organization responsible for reviewing the
application, the state agency. Quantitative and qualitative data will be combined to provide a
comprehensive exploration of the research question in this evaluative study (Creswell &
Creswell, 2018). The quantitative surveys will investigate how the district perceives the value of
submitting a compliant application for federal funds. Additionally, questions will focus on the
district’s perception of how the funds received from federal grants will contribute to future
initiatives for increasing linguistic and academic achievement as outlined by the Every Student
Succeeds Act (ESSA). Following the methodology of the convergent mixed-method study, the
qualitative and quantitative data investigated will be collected using parallel influences of the
knowledge, motivation, and organizational variables proposed in chapter two (Clark & Estes,
2008; Creswell & Creswell, 2018).
In conjunction with the quantitative surveys delivered to school districts, the SEA staff
who oversee the submissions and review of the consolidated application will be surveyed using
open-ended qualitative questions. The qualitative questions will be designed to determine what
organizational factors are influencing the knowledge and motivational gaps described in the
survey results from the school district (Merriam & Tisdell, 2016). The questionnaire will
capture understanding from both the fiscal and programmatic perspectives of state agency
representatives who review and oversee review processes for the consolidated application for
federal funds.
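As a concrete illustration of the convergent parallel logic described above, the sketch below summarizes a closed-ended LEA administrator survey and a hand-coded, open-ended SEA reviewer survey separately and then prints both summaries for side-by-side comparison. The file names, item labels, and coding column are hypothetical and do not reflect the study's actual instruments or analysis pipeline.

```python
import pandas as pd

# Quantitative strand: closed-ended (Likert) LEA grant administrator survey.
# File name and item labels (Q2, Q6, Q7, Q8) are hypothetical placeholders.
quant = pd.read_csv("lea_admin_survey.csv")
response_rate = len(quant) / 178                            # roughly 178 eligible districts
pct_agree = (quant[["Q2", "Q6", "Q7", "Q8"]] >= 4).mean()   # share of agree/strongly agree (4-5)

# Qualitative strand: open-ended SEA reviewer responses, hand-coded beforehand
# into knowledge / motivation / organization (KMO) categories.
qual = pd.read_csv("sea_reviewer_survey_coded.csv")
theme_counts = qual["kmo_category"].value_counts()

# Convergence step: each strand is summarized independently and then examined
# side by side to judge whether an assumed KMO influence is supported by both.
print(f"LEA survey response rate: {response_rate:.0%}")
print(pct_agree)
print(theme_counts)
```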
Definitions
This section provides definitions of terms used frequently throughout this research
study and dissertation.
Local Education Agency (LEA): This is the acronym used for a school district or public board of
education legally authorized to deliver educational services to students. This can include Boards
of Cooperative Educational Services (BOCES) and Educational Consortiums.
State Education Agency (SEA): This is the state education agency or state department of
education responsible for educational resources, information, guidance, and administering
federal funds to local education agencies.
The Elementary and Secondary Education Act (ESEA): Federal legislation designed to create equal
opportunities for students. ESEA was reauthorized in 2015 as the Every Student Succeeds Act
(ESSA).
Comprehensive Support (CS) school: ESSA requires states to have a method for identifying
schools among the lowest-performing 5% of Title I schools, high schools with low graduation
rates, and schools with chronically low-performing student groups.
Targeted Support (TS) school: Any school with at least one consistently underperforming
disaggregated group.
Organization of the Project
Chapter one contains a summary of the impact of the inability of school districts to
submit 100% compliant consolidated applications for federal funding to SEAs. This foundation
is presented to propose an evaluative study analyzing the knowledge, motivation, and
organizational factors that influence the school district’s submission of a consolidated
application for federal funding to a state education agency. Chapter two reviews the current
literature surrounding the success or failure of school districts in securing SEA approval of
applications for funds under ESSA. Chapter three describes the methodology used for this study,
a rationale for the selected sample groups, and the data collection and analysis procedures. The
research findings are presented and analyzed in chapter four. Finally, chapter five presents a summary of the
findings of the study as well as conclusions and recommendations for future research.
CHAPTER TWO: REVIEW OF THE LITERATURE
Federal Grant Applications Approval from a State Education Agency
This literature review examines the possible causes of disproportionately low approval
rates of Local Education Agency (LEA) applications for federal funding to a western State
Education Agency (SEA). This chapter begins with general research on the importance of
supplemental federal funding to Kindergarten through Twelfth grade (K-12) LEAs to assist in
improving academic outcomes for historically underserved students. This is followed by an
overview of the literature on the decrease in federal education mandates allowing for more
flexibility and responsibility for the State Education Agency. This body of literature was
selected to present a comprehensive discussion on federally funded initiatives under the
Elementary and Secondary Education Act (ESEA) as well as effective practices for planning and
implementing programming for federally funded activities at the K-12 education level. To
understand gaps in submission approval of federal grant applications from an LEA or school
district to a State Education Agency, an overview of federally funded initiatives as well as the
intent of the use of funds helps to facilitate understanding of the problem of practice presented in
this dissertation. This literature review incorporates recent research on federal programming
preparation and implementation practices for both SEA and LEA K-12 education agencies in the
United States. Following the general literature is an explanation of the Clark and Estes (2008)
knowledge, motivation, and organizational influences framework used to examine the
disproportionately low approval rates of LEA applications for federal funding in a Western State
Department of Education (WSDE).
Federal Funding in Local Education Agencies
Understanding federal funding to K-12 institutions in the United States begins with the
precedent-setting legislation of ESEA. This act, as reauthorized by the Every Student Succeeds
Act (ESSA), provides supplemental support to state and local funding for K-12 educational
programming. Supplementary federal funding afforded through the legislation provides all
students opportunity to receive equitable and educational opportunities to close academic
achievement gaps. Such opportunities include revised or expanded curriculum, supplemental
salary, reduction in class size, instructional equipment additions or improvements, and other
supplemental resources to enhance learning (Rioux, 1965). Numerous research studies point to
the factors, variables, and causes that influence the importance of supplemental federal funding
for local school districts as approved by state agencies (Cook-Harvey et al., 2016; Heise, 2017;
Kainz, 2019; Wanker, 2005).
To engage in supplemental activities through federal funding, a school district is required
to submit plans that address the requirements of the law under ESSA as well as descriptions of
activities that the district will implement with its Title I, II, III, IV, and V allocations (ESSA,
2018). In 2018, the federal programs unit of a western state education agency approved 70% of
the school district applications within a thirty-day approval deadline. If district applications to
state agencies do not meet compliance leading to final approval, the intent of the funds to
achieve equitable outcomes is compromised as well as potential civil rights violations of
equitable access to education (Dennis, 2017; USDE, 2017).
Consequences of Federal Grant Funding Approval Rates
The school district’s ability to submit a compliant application has further consequences related to
equitable access to educational opportunities for historically underserved students. Supplemental
federal funding from the Elementary and Secondary Education Act of 1965 was founded upon
the principles of equitable education providing “all children significant opportunity to receive a
fair, equitable, and high-quality education, and to close educational achievement gaps” (ESSA,
2018). Closing achievement gaps for historically underserved populations can also help
stimulate the United States’ economy (Cook-
Harvey et al., 2016; Darling-Hammond, Bae, Cook-Harvey, Lam, Mercer, Podolsky, & Stosich,
2016; Thomas & Brady, 2005).
Furthermore, ESSA is designed to meet students’ civil rights to education and can be
used as a catalyst for developing and employing the educational structures needed to ensure that
historically underserved students can succeed and close achievement gaps (Dennis, 2017). If the
disproportionately low final approval rates for federal grant initiatives in this western state
do not improve, further consequences will ensue, impacting the United States economy and
ultimately violating the Civil Rights Act of 1964.
Federal Funding for Public Education
The Elementary and Secondary Education Act of 1965 was born from a tumultuous
landscape in which students living in poverty, students from marginalized racial groups, and
students of other national origins were not receiving equitable academic outcomes. The reforms
of the Elementary and Secondary
Education Act (ESEA) were rooted in the Civil Rights Act of 1964 (Rioux, 1965). According to
this landmark civil rights legislation, discrimination based on race, color, religion, sex, or
national origin was outlawed (Civil Rights Act, 1964; Dennis, 2017). According to J. William
Rioux, the deputy director of the Programs for Education of the Disadvantaged Office of
Education in 1965, ESEA (1965) presented a new horizon for the education of disadvantaged
children marking a new era in education reform. The Title programs adopted under ESEA offer
various supports to upgrade the general quality of educational effort and reforms in the United
States public K-12 education system. Furthermore, the congressional design behind the
legislation was to allow for more educational freedom to institute measures for underserved
students (Rioux, 1965).
Importance of Federal Funding to Public Education
Federal funding afforded through the Every Student Succeeds Act provides all students
the opportunity to receive an equitable education designed to close academic achievement
gaps (ESSA, 2018). When federal funds are not implemented or obligated according to their purpose
and intent, it is a direct violation of students’ civil rights and it puts at risk the programming
intended to close achievement gaps for students (EDGAR, 2018; USDE, 2017). ESSA requires states to
present the four-year adjusted cohort high school graduation rates. Included in the state plan for
high school graduation rate is a disaggregated table containing subgroups and their total four-
year adjusted graduation percentage. The subgroups presented all indicate disproportionately
low high school graduation rates of all ESSA identified subgroups of students in K-12 public
education in this western state. ESSA includes new provisions that can be utilized to increase
equity for historically underserved students including underprivileged or low-income, minority,
English learners, special education, and homeless student populations (Cook-Harvey et al.,
2016). The accountability embedded in the state report card is designed to measure and report
outcomes for students utilizing federal funding initiatives (Thomas & Brady, 2005).
The Elementary and Secondary Education Act is currently in its most recent reauthorization,
adopted under the Obama administration in 2015 as the Every Student Succeeds Act (ESSA). With
the adoption of ESSA in 2015, the federal government offers LEAs the ability to
design equity-driven initiatives. The initiatives under ESSA are intended to prepare
historically underserved students for the rigorous demands of the 21st century. Cook-Harvey et
al. (2016) discuss the ability to leverage opportunity through federal funding streams noting the
new flexibility afforded through the reauthorization of ESEA in 2015. The authors unpack the
new provisions under ESSA that provide more opportunities to focus on higher-order thinking,
multiple measures of assessment, resource equity, and evidence-based interventions (EBIs).
Darling-Hammond et al. (2016) expand on these indicators, noting that states are required to
use multiple measures of performance and assessment data, which creates latitude beyond test scores
and allows a body of evidence to determine academic growth. Resource equity refers to the
assistance of programming for schools identified under Targeted Assistance or Comprehensive
Support and Improvement funds (Darling-Hammond et al., 2016). These school sites receive
measures from the state to ensure that there are equal funding streams provided to all schools
while placing priorities on federally identified schools. Finally, Evidence-Based Interventions or
EBIs are activities, strategies, or interventions that are grounded in empirical evidence or
demonstrate a significant effect on student achievement (Darling-Hammond et al., 2016).
Federal funding provides supplemental supports for historically underserved students in
the public kindergarten through twelfth grade (K-12) system in the United States. While there
are opportunities afforded through federally funded supplemental activities in the U.S. K-12
public education system, there remains a lack of research around effective planning and
implementation of federal funds.
Lack of Current Research on Outcomes Related to Federal ESSA Funds
Education research is driven by a local or contextual perspective that can be restrictive to
the site from which the research is being conducted. For example, education research is often
concerned with identifying promising practices at specific school sites. While research is
available for effective programming, the identification of practice does not necessarily ensure
that the research is being implemented (Dynarski, 2015). Local research, coupled with a general
lack of dissemination of research on program effectiveness and development, remains limited to
the context in which it is conducted. According to Dynarski (2015), the
Government Accountability Office expressed concern about whether educators use the research
that is available and whether further research actually changes
educational practices. Under ESSA, the state agency is required to report annually on the
progress of historically underperforming or underserved students. The research around
implementing and developing plans to enhance outcomes beyond compliance practices is
lacking.
Teachers and practitioners are arguably the most involved in policy, yet teacher policy
advocates are the least studied stakeholder in U.S. public education reform (Jones, Khalil, &
Dixon, 2017). There are resources related to the impact of policy on teachers’ work; however,
there is little understanding of how teachers interpret policy. The lack of research around teacher
interpretation can lead to a misunderstanding of how policy is implemented at the local education
agency. While guidance exists for federally funded activities, there is a general lack of research
around planning, implementing, and evaluating effective programs under ESSA (2018). Despite
these challenges, states and local school districts are required to report the performance of
targeted populations to the United States Department of Education (USDE) using state identified
metrics.
Identified Targeted Populations for Federal Funding
With the passage of ESSA, the role of the State Education Agency has shifted to
supporting schools and districts in improving outcomes for all students (Cook-Harvey et al.,
2016; Weiss, 2016). Among initiatives to change assessment flexibility and other accountability
measures, the state plan must also provide assurance that the state reports accountability data.
The accountability data is presented by each state to the United States Department of Education
(USDE) annually in the form of a state report card. Each state is also required to submit the
report card in an accessible format, tabulated by major student subgroups (Aragon, Griffith,
Wixom, Woods, & Workman, 2016). Despite the goals outlined by the state plan, there still are
major and persistent achievement gaps among subgroups of students. For example, the report
card submitted by the western state education agency (as required by ESSA) outlines the
performance of all K-12 students including academic achievement, academic growth, graduation
rate, and English language proficiency.
Targeted programming using Title funds helps to create opportunities to close
achievement gaps in historically underserved populations of students. Therefore, Title
programming under ESSA is the lever for improving educational outcomes and opportunities for
students identified to be at a disadvantage or in academic subgroups (Darling-Hammond et al.,
2016). ESSA aims to connect policy, existing research, school processes, and educational
practices for students that attend high-poverty schools with high populations of identified
subgroup indicators (Kainz, 2019). Since the legislation’s origin, ESEA has been the single largest
source of federal funds serving educationally disadvantaged students (Thomas & Brady,
2005). Over time, the funding has evolved to include the needs of targeted subgroups of
students, including English learners (the Bilingual Act; Title VII), female students (the Women’s
Educational Equity Act; Title IX), and Native American students (the Improvement of
Educational Opportunities for Indian Students Act; Title X) (Thomas & Brady, 2005). Various
reform movements over time have shaped the populations that ESSA serves to this day, and
the student groups that are served are reported annually through the ESSA state plan. Annual
reporting of performance for historically underserved students could help create accountability
systems and pathways towards linguistic and academic proficiency. Accountability through state
report cards is one component of the new state flexibility afforded through the adoption of ESSA
(Weiss & McGuinn, 2016).
Federal Funding Initiatives under the Every Student Succeeds Act
The Every Student Succeeds Act has shifted the role of the state agency through
decreases in federal education mandates. The new law under ESSA shifts from prior legislation
under No Child Left Behind to offering more flexibility to states and control over education
policy (Aragon et al., 2016). ESSA also offers greater flexibility in the types of assessments that
states use for accountability measures.
Decrease in Federal Education Mandates
The No Child Left Behind Act created a new measure of state responsibility and established
authority over educational outcomes (Wanker, 2005). In February of 2016, the
Committee on Education and the Workforce conducted a hearing before the U.S. House of
Representatives. The committee discussed the changes proposed in the Every Student Succeeds
Act legislation of 2015. Under No Child Left Behind in 2001, the federal government imposed
strict limitations and punitive actions on schools and districts in areas including, but not limited
to, teacher hiring practices, how to gauge school achievement, and corrective action for
underperforming schools. State and local leaders raised concerns about the top-down approach
and about too much federal control over local education agencies. No Child Left
Behind (NCLB) laid the foundation for expanding federal regulation and the redesign in ESSA
reflects a shift in the federal role (Fránquiz & Ortiz, 2016). The redesign is possible due to the
change in the landscape of the state’s role in federal programming over the last twenty years.
Opposition voices portrayed NCLB as an intruding force upon state and local
control of education, resulting in a flip of accountability from the federal to the state level
(McDonnell, 2005).
Heise (2017) argues that structural changes put into place by NCLB were undone by ESSA
which repositioned significant federal education policy control as the state’s
responsibility. Before 2001, no federal imposition into public school policy in the United States
had been as extreme as that of NCLB (Ellerson, 2012). After ESSA, broader issues of federalism
invite policy-making power at the state level, creating tension between regulation and the market
value of education commodities (Heise, 2017). The strong opposition to NCLB allowed for a
redesign of the ESEA, shifting federal control to the state and local education agencies. Based on
Heise’s (2017)
argument, the role of the state continues to transform under the new ESSA legislation forging
new opportunities for state reporting, design, and implementation.
Role of the State Education Agency in Federal Funding
With the shift of accountability to the state level, the state education agency has
transitioned to supporting schools and districts in improving outcomes for all students. Aragon et
al. (2016) discuss state responsibility after NCLB as a greater emphasis on reporting progress of
subgroups or identified targeted populations. Flexibility in accountability systems influences
school improvement initiatives as well as funding. According to the authors, it is predicted that
schools identified under comprehensive improvement will drop from a current 50,000 to a
projected 4,000 schools. The state education agency is required to submit a state plan to the US
Department of Education, developed in consultation with the state board and a variety of
stakeholders (representatives of Indian tribes in the state, teachers, principals, charter school
leaders, specialized instructional support personnel, paraprofessionals, administrators, other staff,
and parents) in a timely and meaningful manner.
Fránquiz and Ortiz (2016) describe the shifting of power in designing and
implementing programs from the federal government to the state and local levels. Because ESSA
shifts authority back to the state and local levels, the authors recommend that communities
organize to ensure that students receive the supports they need and that feedback from all
stakeholders is included in this process. ESSA is a shift away from the accountability measures in No
Child Left Behind under Annual Yearly Progress measures (AYP) ushering in a new era where
successes and failures are not based on assessment data alone. ESSA calls for a body of
evidence or multiple measures of success to monitor learning and improvement efforts rather
than its predecessor’s AYP measure (ESSA, 2018).
Under ESSA, federal accountability mandates are decreasing, creating more flexibility for
states. With increased flexibility, states are given additional responsibility to improve outcomes
for every student the state serves. State administrators of federal funds are not only stewards of
compliance with federal requirements and state statute but also focus on leveraging funds to
improve outcomes. Weiss and McGuinn (2016) recommend reexamining the role of state education
agencies due to the new responsibilities under federal compliance. Weiss and McGuinn (2016)
further argue that federal accountability mandates have shifted creating additional flexibility for
the state educational agency to facilitate change and improve outcomes for all students. The new
responsibility of the state for improvement efforts ushers in a new era of planning and
implementing state funds from the state level.
Planning and Implementation of Federally Funded Activities
Federally funded activities present challenges as well as solutions for school districts
seeking approval of formula grant applications to meet compliance and plan effective activities
for students. The Every Student Succeeds Act supports local innovations through Evidence-
Based Interventions (EBIs). ESSA requires state and local education agencies to provide
information to educators and communities regarding annual statewide assessment and measures
towards progress (USDE, 2017). Federal mandates maintain an expectation that accountability
and action are taken to effect change in the lowest-performing schools and with students and
targeted subgroups that are not showing progress (ESSA, 2018).
Local Education Agency Policy and Budget Decisions
Cuts to state and local education funds affect general programming and create tension for
supplanting and new initiatives proposed using federal funds. In response to budget cuts in
2008, The American Association of School Administrators (AASA) began to collect studies
measuring the impact of the economic downturn on schools. The survey of 528 administrators
from 48 states reported dramatic cuts in state and local operating budgets, with more than half of
the states reporting budget cuts. Federal funding creates an opportunity to supplement decreasing
general funds (Ellerson, 2012). Due to the local control nature of the western state governing
policies, it is the decision of the local elected or appointed representative of the institution to
determine the goals, directives, outcomes, and budgets of the public schools under
their purview (Edglossary, 2014).
Jiménez-Castellanos et al. (2019) investigated policies that impact educational decisions.
The researchers found that such policies are a collection of values brought forth through power
relations by various political actors. Because of this,
it can be extrapolated that politics and education are implicitly connected. The authors argue that
local state control creates educational funding formulas that perpetuate power dynamics and
stifle school improvement. While there are barriers created from federal policy and budget cuts,
further solutions have arisen from the flexibility of state plans submitted to the United States
Department of Education.
Local Education Solutions for Approval Process
To receive funding, local school districts submit plans to the state department for
approval of funds to be used in the local context. The new initiatives outlined under ESSA
create long-term goals for students and LEAs and a more supportive role for SEAs (Mathis &
Trujillo, 2016). The state system could create a meaningful consultation and categorizations for
identifying schools to receive additional targeted and comprehensive support. Darling-
Hammond et al. (2016) explore the new pathways to reexamine and define accountability
systems and reporting to the United States Department of Education. State accountability
indicators include a new allowance for measures of school progress, measures of school
identification for support and intervention, and evidence-based intervention initiatives.
Title I is designed to help underachieving schools through targeted support and
improvement efforts. Borman (2000) documents the effectiveness of Title I interventions and
possible outcomes for new practices utilizing Title funding. The study examines how developing
and replicating effective programs could benefit other school districts applying for funding.
According to the researcher, Title I is a symbol of the government’s commitment to equitable
services for students. Title funding provides an opportunity for students to close existing achievement gaps. State plans allow for new and unique measures of academic progress using state-selected metrics for improvement in underperforming schools and targeted student
groups. Borman (2000) argues that developing programs that meet the compliance measures
outlined by ESSA is the catalyst for creating effective supplemental programming using federal
funds. Achieving 100% final approval on federal applications is the ultimate goal of the study's stakeholder, as it allows the school district the space to develop and enhance supplemental programs for historically underserved students (N. [Redacted], personal communication, February 21st, 2019).
Role of Stakeholder Group of Focus
Lewis (2011) situates stakeholder theory around how organizations allocate and draw
attention to various recognized attributes of their work. For example, Lewis (2011) contends that
stakeholders generally hold three attributes that influence their role within larger systems
including power, legitimacy, and urgency. As defined by the United States Department of Education in the Elementary and Secondary Education Act (ESEA) of 1965, a Local Education Agency is a public authority legally
constituted within a state for either administrative control or direction of its public elementary
and secondary schools. This study is focused on the relationship of the Local Education Agency
to the State Education Agency. In the context of this western state’s educational system, an LEA
operates by the direction of the state department of education but is authorized to exert and
enforce its local control through the governing and management of public schools through
appointed representatives (USDE, 2017).
As proposed by Lewis (2011), the power relationship that exists is the ability for the
stakeholder to impose their will for what they believe is the right direction for their federal funds.
According to ESSA, it is the state's responsibility to determine whether a school district's proposed activity meets statutory compliance, which contributes to the legitimacy of the district's proposal
for federal funds. Finally, Lewis (2011) invokes the sense of urgency demonstrated by the role
of the stakeholder. In this case, the timelines to propose, implement, obligate, and draw down funds are all attached to the deadlines assigned by the United States Department of Education
and state requirements for administering and delivering grant awards of formula funds (EDGAR,
2016). According to the western state ESSA plan, the goal of the department is to shift the country's education law away from reliance on federal oversight toward providing states additional flexibility and ultimately more decision-making power at the local level. The stakeholder group for this
study is the LEA, which exerts its power, legitimacy, and urgency when applying for and
administering federal funds from the state department of education.
Clark and Estes’ (2008) Knowledge, Motivation and Organizational Influences Framework
Clark and Estes (2008) suggest a gap analysis approach for diagnosing stakeholder
performance gaps in organizations by using three key influences including knowledge,
motivation, and organizational gaps. This investigation begins by addressing the gap(s) that
exist within the knowledge influences impacting an organization’s objectives. Drawing from
Krathwohl's (2002) four dimensions of knowledge and skills, the study assessed possible knowledge gaps that exist within the factual, conceptual, procedural, and metacognitive knowledge categories. The second of the three influences to examine is motivation.
Motivation is a context-dependent mindset that initiates and sustains behaviors influencing goal-
directed outcomes (Mayer, 2011; Rueda, 2011). Finally, influences such as cultural settings,
models, resources, and practice are examined to determine performance gaps contributing to
organizational issues (Clark & Estes, 2008).
For the Western State Department of Education (WSDE) to reach its performance goal of
100% final approval of applications for federal funds, this study employs a modified gap analysis of the types of knowledge, motivation, and organizational influences that need to be addressed using
the Clark and Estes (2008) framework. The first section will be a discussion of the knowledge
influences involved in planning, developing, and implementing programs outlined in the
consolidated application for federal funding. Second, the implicit motivation involved in the Local Education Agency's ability to submit an approved application for K-12 federal education funds will be examined. Third, the organizational influences impacting the performance goal
will be investigated. The knowledge, motivation, and organizational influences will be further
explored through the methodology section in Chapter 3.
Stakeholder Knowledge, Motivation, and Organizational Influences
This review of the current literature focuses on the knowledge, motivation, and
organizational influences that a Western State Department of Education (WSDE, pseudonym)
needs to focus on to achieve its stakeholder performance goal. The performance goal of this
western State Educational Agency is to grant 100% final approval to federal funding applications
submitted by Local Educational Agencies by September of 2022.
For the WSDE to reach its performance goal and close a performance gap, the types of
knowledge and skills involved in the process of application submissions need to be addressed
(Clark & Estes, 2008). According to Clark and Estes (2008), there are three causes of
performance gaps: knowledge, motivational, and organizational issues. The authors
employ a gap analysis approach to diagnosing and solving performance issues within
organizations and start with determining what gaps exist within knowledge and skillsets. The
first step is identifying whether stakeholders know how to accomplish their performance goals.
Solving problems connected to performance is directly linked to the knowledge and skills
required to conceptualize, analyze, and execute challenging tasks.
Knowledge and Skills
Knowledge influences can be divided into four categories following Krathwohl's (2002) addition of a fourth dimension. Using Bloom's Taxonomy of Educational Objectives,
Krathwohl (2002) expanded knowledge influences to include factual, conceptual, and procedural
knowledge categories with the addition of metacognitive knowledge. Factual knowledge refers
to the basic elements of the content including facts and terminology related to what is being
learned or demonstrated (Krathwohl, 2002). Conceptual knowledge relates to the variety,
principles, implications, structures, interrelationships, and differences between various
classifications and categories that exist within a content area (Krathwohl, 2002; Mayer, 2011).
Procedural knowledge refers to the skills and techniques involved in a process to achieve the
desired outcome (Krathwohl, 2002; Mayer, 2011). The metacognitive category investigates the
thought processes involved in reflection and evaluating the strengths and challenges associated
with planning, monitoring, and adjusting strategies to accomplish performance goals (Krathwohl,
2002; Baker, 2006).
Three types of knowledge influences will be investigated in this study: declarative (factual), procedural, and metacognitive (Krathwohl, 2002; Mayer, 2011). Factual knowledge includes the stakeholder's
understanding of reporting requirements under the Every Student Succeeds Act (ESSA). Under
ESSA, states are required to report educational performance data from historically underserved
students. The local school district is charged with the task of reporting this data to the state
department of education (Thomas & Brady, 2005).
The following sections address three knowledge influences on the LEA's submission process for federal funding applications, with each influence categorized according to the discussion above. Classifying the knowledge influences facilitates determining assessment methods and benchmarking criteria for the stakeholder to meet the performance goal of final approval of federal grant applications.
Developing knowledge of the consolidated application for federal funding. The first
knowledge influence that a school district needs to achieve its performance goal is a strong
foundation of the declarative knowledge of the components of the consolidated application for
federal funding. Declarative knowledge falls under the factual category described above from
Krathwohl's (2002) framework. A basic understanding of the federal application falls under the category of declarative knowledge, as it comprises the facts and information essential for understanding the rest of the process of completing and submitting the federal funds application for K-12 public schools. This information also shapes what training is conducted and how. Once these features are understood, further technical skills can be developed and enhanced (Aguinis & Kraiger, 2009).
The Elementary and Secondary Education Act of 1965 as reauthorized by ESSA in 2015
informs states of their obligations under federal law to obligate funds to LEAs on an annual
basis. States disburse the funding with the assurance that the grants are awarded promptly while ensuring that the plan for funds supports positive outcomes for students. Declarative knowledge
of the consolidated application involves the basic functions of each piece of the application. The
application serves as the statutorily required LEA State plan under federal law. Therefore, the
LEA must understand the components of this application. Once an LEA or school district
receives final approval, the district may obligate funds for the activities selected (hiring new
staff, implementing programs, purchasing equipment, etc.) using supplemental federal funding.
Incorporating a comprehensive needs assessment into planning for federal funds.
The second knowledge influence that school districts need to address to achieve their
performance goal is incorporating a needs assessment into the planning for federally funded
activities. According to Krathwohl (2002), this knowledge type falls under the procedural
category. Procedural knowledge influences the processes and procedures involved in the
planning, development, and implementation of a state or local plan for federally funded
education initiatives. The comprehensive needs assessment is outlined in the ESSA statute and is a requirement for every LEA applying for funds (ESSA, 2015). Grossman (2011) reviews recent scholarly research on the transfer of training to help uncover issues related to transferring training to desired learning outcomes. Local Education
Agencies are asked to create a comprehensive needs assessment through a state-developed
unified improvement plan (UIP) or another measure developed by the local agency. This piece
of the process is instrumental in developing meaningful activities that align with the needs of the
district. This is typically developed through a root cause analysis of the current issues that exist
and is based upon training delivered by the state department of education. The state training
creates a space for the local school district to learn the procedural knowledge necessary to
accurately incorporate activities to address the underlying needs determined through the needs
assessment.
The needs of the district are developed through what Kirshner et al. (2006) describe as
schema or the framework of categorization of information and how to apply the information
elements to a given task. To conduct a comprehensive needs assessment, the Local Education Agency needs to identify and develop schema around what root causes could lead to performance issues and around the information provided through training. Because federal mandates have deferred more accountability to state and local
control, state and local agencies are afforded new flexibilities for innovative activities to meet
academic outcomes for historically underserved students (Weiss & McGuinn, 2016). Once the
schema is organized and developed, it can be operationalized to identify the procedural knowledge needed to complete the consolidated application for federal funds submitted to WSDE by LEAs.
Reflecting on effectiveness of federally funded programming. The third knowledge
influence Local Education Agencies need is to reflect on the effectiveness of federally funded
programming initiatives. This knowledge influence is a metacognitive influence that is affected
by the participant's understanding of their cognitive processes (Baker, 2006). When granting
federal funds to school districts, the state’s role is to determine if an activity granted for funding
falls under compliance and allowable activities under federal statute. Many districts apply for
funding through Title programs and meet compliance, but compliance does not ensure that the
funds are leveraged in a way that creates the best opportunities for students. ESSA offers LEAs new avenues under federal mandates to reexamine the planning and delivery of instruction utilizing new accountability measures. For an LEA to strive for better practices, further
measures to ensure quality programming would need to be instituted. Part of this process
involves awareness of how the activities are functioning within the goals of the district.
Baker (2006) operationalizes and defines metacognition processes by exploring memory
and attention as part of the learning process. Metacognition occurs when the learner is aware of
the cognitive processes involved in learning. Mayer (2011) separates metacognition into two
distinct components including awareness and control. Awareness involves knowing how one
learns and control involves monitoring and controlling learning. As districts become more
familiar with the instruments and effectiveness of the different funding streams available through
federal funding, the district can begin to monitor and control the planning and implementation of
the activities to generate desired outcomes that are both compliant with federal mandates and
effective for student learning targets. The school district will in turn be asked to monitor their
programming using Mayer’s (2011) framework of comprehension monitoring. Baker (2006)
describes this metacognitive process as the ability to understand the cognitive demand of the task
at hand as well as how to effectively carry out a task, approach the task, and execute. This will
help inform the state of how well the local school district understands the different components
of federal funding and lead to more effective outcomes for programming and student learning.
WSDE’s mission to ensure federal funds improve academic and linguistic outcomes for
historically underserved students is directly aligned to the goal of collaboration and
understanding of effective federal fund initiatives. Table 3 presents the organizational mission and performance goal, along with the knowledge influences, knowledge types, and knowledge assessments discussed above.
Table 3
Knowledge Influences, Knowledge Types, and Knowledge Assessment

Organizational Mission: The Western State Department of Education Office of Elementary and Secondary Education aims to evaluate how districts are using federal funding to improve the academic and linguistic outcomes for historically underserved students.

Organizational/Stakeholder Performance Goal: By July 31st, 2022, Local Educational Agencies will show 100% final approval on the Consolidated Application for federal funding from the Western State Department of Education.

Knowledge Influence 1: The LEA needs declarative knowledge of the elements and sections of the Consolidated Application for federal funding in the western state K-12 public education system. Knowledge Type: Declarative (Conceptual). Knowledge Influence Assessment: The LEA will be provided with survey data to identify familiarity with the consolidated application manual for federal funding.

Knowledge Influence 2: LEAs need to know how to incorporate a comprehensive needs assessment into activity plans for Title funding. Knowledge Type: Procedural. Knowledge Influence Assessment: LEAs submit a sample proposal to demonstrate how the LEA will develop and implement programming with federal funding for the upcoming school year.

Knowledge Influence 3: LEAs need to know how to self-reflect on the effectiveness of their federal funding programs. Knowledge Type: Metacognitive. Knowledge Influence Assessment: LEAs will be surveyed to monitor their federal program initiatives in the consolidated application to ensure self-reflection that leads to metacognition.
Motivation
Clark and Estes (2008) discuss motivation, the second component in examining performance issues and conducting a gap analysis. Mayer (2011) defines motivation as a driving force that initiates and sustains behaviors influencing goal-directed outcomes. He describes the four components of motivation as personal, activating, energizing, and directed, all of which contribute to an individual's overall effort to engage in a task or activity. The four components
of motivation outlined by Mayer (2011) also guide a working understanding of the influences or
limits of a stakeholder’s motivation to engage in any given subject matter. For this investigation,
motivation is examined within the Local Education Agency’s ability to submit an approved
application for federal funding for K-12 public education.
Pintrich (2003) offers a multidisciplinary approach to motivational science that focuses
on the multifaceted nature of human behavior and ideas to address motivation with research-
based methodologies. He identifies adaptive attributions and control beliefs as motivating
factors for stakeholders as well as goal setting as a motivational influence. Rueda (2011) draws
attention to the notion that motivation is also heavily dependent upon the context or environment
in which the individual participates. Drawing on frameworks from Pintrich (2003) and Rueda
(2011), motivational beliefs are then generally constructed based upon the stakeholder’s
interactions. Much of the LEA's understanding of federal policies is generated through interaction with the state education department; therefore, motivational outcomes can be heavily dependent upon this interaction. Furthermore, developing and enhancing effective
programs using supplementary federal funds can strengthen the partnership between the state and
local school districts (Borman, 2000).
The two theories or constructs that are relevant to the performance goal of an LEA
reaching final approval on federal funding applications are attribution theory and self-efficacy
theory. Attribution theory is related to internal and external factors that affect task completion,
whereas self-efficacy is the individual’s belief that he or she can complete the task (Bandura,
2005; Anderman & Anderman, 2006).
LEAs should attribute their success and failure to their own efforts. The first
motivational influence that the LEA needs relates to attribution theory. Attribution occurs when a learner attributes success in a task to internal factors such as ability or effort or to external factors such as task difficulty or luck. Attributions for failure include a lack of ability, a lack of effort, or poor instruction (Anderman & Anderman, 2006).
Weiner’s model of attribution is situated around how causal beliefs are formed and helps to
determine what environmental factors and background variables affect attribution choices with
stakeholders. To understand a school district's potential attributions for unsuccessful consolidated application attempts, questions will be framed around whether successful outcomes are contingent on external, internal, or environmental factors.
Additionally, in precedent-setting research, Kelley and Michela (1980) discuss the idea of
perceived causation in reference to how attributions influence behaviors. The authors posit that
both antecedents and consequences are determinants of behaviors. Antecedents refer to the
cause before the behavior with direct reference to internal and external causes. Perception of
status from a state agency to a local education agency may create an external cause for
compliance based on perceived power relationships between both agencies. Heise (2017)
discusses this power tension that exists from the policy-making of the state to the regulation of
the local agency. The district may consider its performance on the state funding application a direct result of the reviewer from the state agency rather than considering that the performance is more intrinsic to the LEA's ability to design and implement effective federal plans for funding.
LEAs need to believe in their capacity to receive final approval. The second
motivational influence that WSDE needs to achieve its stakeholder goal is self-efficacy. Pajares
(2006) discusses the concept of self-efficacy as a central aspect of social cognitive theory. Self-efficacy refers to the judgments individuals hold about their abilities to perform the actions needed to produce favorable outcomes. According to Pajares (2006), an individual's behavior is
determined by the innate belief that one can accomplish a task. Bandura (2005) describes self-
efficacy as a construct of an individual's environment. If an individual demonstrates higher self-efficacy, their ability to perform an activity is more likely to lead to a favorable outcome. This can be interpreted as a positive or negative self-efficacy spiral, where an individual demonstrating positive self-efficacy puts forth more effort, learning, and persistence, resulting in higher self-esteem and achievement. The inverse relationship results in a negative spiral of achievement and self-efficacy.
Researchers have established that performance outcomes are directly aligned with the self-efficacy beliefs and behaviors of an individual. Pintrich (2003)
discusses adaptive self-efficacy in his work around student motivation. Pintrich (2003)
determined that adaptive self-efficacy and competence are motivating to participants. When
individuals expect to do well, they tend to work harder towards their goal, persist in their efforts,
and generally perform better.
Table 4 shows the organizational global goal, stakeholder goal, and two assumed
motivational influences. The table also outlines assessment tools for the perceived motivational
influences of the stakeholder groups.
Table 4
Motivational Influences and Assessments for Analysis

Organizational Mission: The Western State Department of Education Office of Elementary and Secondary Education aims to evaluate how districts are using federal funding to improve the academic and linguistic outcomes for historically underserved students.

Organizational/Stakeholder Performance Goal: By July 31st, 2022, Local Educational Agencies will show 100% final approval on the Consolidated Application for federal funding from the Western State Department of Education.

Assumed Motivational Influence 1 (Attribution Theory): LEAs should feel that approval of grant applications is based on their ability to submit a compliant grant application rather than on the reviewer's comments. Grant administrators need to believe that receiving the grants will improve academic performance and help close achievement gaps for traditionally underserved populations.
Motivational Influence Assessment 1: Written survey items (Strongly Disagree to Strongly Agree, 1-5 Likert scale): "My success on the application is dependent on my knowledge of the consolidated application for federal funding." "My success in the application is dependent on the reviewer."

Assumed Motivational Influence 2 (Self-Efficacy): LEAs need to believe they are capable of submitting an application for federal funding and receiving approval upon the first submission.
Motivational Influence Assessment 2: Written survey item (Strongly Disagree to Strongly Agree): "I am confident in my ability to submit an application for federal funding that receives substantial approval." Open-ended survey items: "To what degree do you feel confident in your ability to submit an application for federal funds?" "How confident are you in achieving educational goals for meeting the needs of your LEA using federal funding?" "Share your organizational goals for increasing student performance and closing educational gaps in your LEA."
Organization
Due to the multifaceted nature of culture, the literature suggests multiple definitions and understandings of culture as it relates to organizations. Researchers have described culture in
an organizational setting as the culmination of beliefs, initiatives, viewpoints, sentiments, and
learned procedures. The accumulated shared learning of a group culminates in the structures,
principles, and values that are contextually related to the environment in which the learning
occurs (Berger, 2014; Kezar, 2001; Schein, 2017). When knowledge and motivation influences have been ruled out, organizational influences can be the source of barriers within an
organizational structure (Clark & Estes, 2008; Schein, 2017). Rueda (2011) asserts that
organizational factors contribute to the gap analysis framework by diagnosing the goals of an
organization and evaluating outcomes related to the established goals.
Culture shapes the core values and norms of its members. Gallimore and Goldenberg (2001) present two frameworks in which to understand organizational culture. The first framework is the cultural model, which refers to the shared understanding of conceptual schemas and practices that contribute to greater global goals within an organization. Schein (2017) presents the notion of invisible and visible models that exist within cultures. He further posits that culture is the culmination of what a group learns over a period of time. These characteristics surface in visible patterns, rituals, and traditions and also through more implicit models that are not tangibly visible. Cultural models are generally invisible and contribute
to trust in an organization. The second framework is the cultural setting, which is a visible and concrete
manifestation occurring within the social environment in which work is conducted. The cultural
setting is visible and is often tied to performance goals and feedback (Gallimore & Goldenberg,
2001).
The organization needs a culture of collaboration. Culture is a dynamic model existing
within visible and invisible patterns of systems. A dynamic interplay of visible and invisible systems exists when groups or organizations solve problems at an external level while integrating internal and invisible systems (Erez & Gati, 2004; Schein, 2017). This interplay can also exist
within two entities working together for one outcome. In this case, the culture between WSDE
and local school districts is one forged through trust and a mutual interest in creating equitable
educational opportunities for historically underserved groups. The Elementary and Secondary
Education Act (ESEA) is the primary connection for the interaction of the two agencies. ESEA, as reauthorized by the Every Student Succeeds Act (ESSA), grants supplementary supports and educational opportunities through federal funds (Rioux, 1965). Gallimore
and Goldenberg (2001) discuss cultural models and settings that largely determine greater
discourses in educational policy. This landmark federal legislation is one that impacts multiple
models of culture from the United States Department of Education to the student receiving the
service. The interplay of cultural models and settings combines to achieve the stakeholder goal of 100% compliance for federal funding applications, creating additional programming to close achievement gaps for historically underserved students. Compliance in federally funded applications is fostered
through collaborative partnerships between WSDE and school districts built upon a foundation
of trust (Bensimon & Neuman, 1993; Hill, Kogler, & Keller, 2009).
The organization needs a culture of trust. Trust within an organization is an invisible
element related to cultural models that are currently in place within the existing structures of an
organization. According to the literature, trust is built or undermined through encounters and
exchanges with coworkers and superiors. In this case, trust is fostered between the state and
local agency contexts. This trust is facilitated and evaluated through interactions between the
two entities. Clear communication of expectations between the state and local school districts
will help facilitate trust as the decision-making process is shared and developed in a
collaborative effort (Agocs, 1997; Korsgaard & Whitener, 2002).
ESSA aims to connect policy, research, and new supplemental practices for students with
high populations of low-performing targeted groups of students (Kainz, 2019). Because both the school district and the state agency have vested interests in reporting the data collected for low-performing targeted populations, it is important to foster a culture of trust between the two participating
agencies to produce positive outcomes for students. At the root of this collaboration lies trust
between the state and school districts. A key component of collaboration lies in engagement by both parties involved.
The organization needs time to engage in training opportunities offered. Training is
necessary for the integration of solutions within an organization. According to the Clark and Estes (2008) framework, training needs can be diagnosed through gap analysis of knowledge barriers. Structural and institutional changes at the macro and micro levels occur
as a result of the transfer of training to provide positive organizational outcomes. Berbary and
Malinchak (2011) explore the role of engagement in government organizations. Engagement
occurs when employees take responsibility for the organizational mission. Engaged workers are
often more adept with the utility and conceptual understanding of their work and how it relates to
the ultimate goal.
In the context of federal funding, engagement in improvement accountability initiatives will be a direct result of the state agency working closely with local school districts to plan, implement, and design outcomes under the new flexibility of federal accountability mandates (Weiss & McGuinn, 2016). Survey data is an excellent way for organizations to diagnose how they are performing relative to others, which will help determine the needs and specific circumstances of each local agency participating in the survey (Berbary & Malinchak, 2011).
Engagement in training activities fosters positive outcomes for underachieving and historically
underserved students using federal funding.
LEAs need peer districts to model promising practices for implementation. As
delineated by Ozcan (2008), peer benchmarking provides insight into how an organization is
functioning relative to other organizations within its market. Each quarter, the federal programs
office of the state agency provides regional network meetings to every school district in the state.
Peer benchmarking will play an integral role in determining the effectiveness of this technical assistance and training.
As mentioned earlier, the state has a new role of reporting the progress of targeted
populations to the federal government and therefore needs to work much more closely with the school districts that it serves to understand the accountability measures in place and to develop a body of evidence to monitor learning and improvement efforts (Aragon et al., 2016). Under ESSA, state administrators are not only stewards of federal funds but also focus on leveraging funds to improve outcomes for students (Weiss & McGuinn, 2016).
The blending of cultural models and cultural settings from the state agency encourages
school districts to carefully craft plans to address academic achievement that rise above compliance to effective practices for students. Table 5 presents the assumed organizational influences and the assessments associated with each influence discussed above.
Table 5
Organizational Influences, Organizational Types, and Organization Assessment

Cultural Model Influence 1: The LEA needs a culture of collaboration. Organizational Influence Assessment: Quantitative demonstration of final approval improving at the LEA level. Qualitative: survey to determine if review comments are helpful and if the LEA has reflected the changes recommended by the SEA in its application.

Cultural Model Influence 2: The organization needs a culture of trust. Organizational Influence Assessment: Survey questions to determine trust relationships between the SEA and LEA to see if there is a demonstrable change over time and training efforts.

Cultural Setting Influence 1: The organization needs to provide time to engage in training opportunities offered. Organizational Influence Assessment: Survey questions about how much time has been spent on training opportunities offered by the SEA.

Cultural Setting Influence 2: The organization needs peer districts to model promising practices. Organizational Influence Assessment: Survey questions about regional networking outreach and its effectiveness with integrating peer models into the LEA's consolidated application for federal planning.
Conceptual Framework: The Interaction of Stakeholders’ Knowledge and Motivation and
the Organizational Context
Conceptual frameworks serve as the scaffolding of a study, employing concepts to be operationalized and tested (Merriam & Tisdell, 2016). This study offers a
conceptual framework as a visual and written narrative to present the theories, variables, and
models as well as the relationships between them (Maxwell, 2013). According to Creswell and
Creswell (2018), a theory in a mixed-methods study provides an orienting lens to craft the
questions, participants involved, data collection, and overall implications. Theory is used to present an overarching perspective in research design.
In the prior sections, the knowledge, motivation, and organizational influences are
presented independently to provide context and to illustrate how the potential influences
presented are interconnected. The interaction of the potential influences informs the conceptual
framework presented in this design. The conceptual framework will serve to connect and
systematize the interrelated theories that exist within the potential influences (Creswell &
Creswell, 2018). The visual and narrative of the conceptual framework demonstrate the
relationships that exist between the knowledge, motivation, and organization factors necessary to
achieve the organizational goal.
The stakeholder group of focus in this study comprises the Local Education Agency grant administrators responsible for submitting the consolidated application for federal funds to the State Education Agency. For this study, the SEA is referred to as the Western State Department of Education (WSDE, pseudonym). Figure 1 illustrates how the knowledge and motivational influences interact to achieve the WSDE goal of 100% compliance of submitted grant applications for federal funding by July 31st, 2022.
Figure 1. Interactive Conceptual Framework for Federal Funding to Local Education Agencies. The figure depicts the Local Education Agency, influenced by knowledge and skills (metacognitive, procedural) and motivation (self-efficacy and attribution), and the State Education Agency, influenced by cultural settings and cultural models, interacting toward the stakeholder goal of a 100% approved consolidated application from the LEA to the SEA.
To achieve the stakeholder goal of 100% approval of applications for federal funding
from WSDE to the local school district, there needs to be a collaboration between the state and
local education agencies. Both WSDE and the school district are instrumental in this process by following federal guidance and support to ensure that the intent of Every Student Succeeds Act funds is met. Grant administrators ensure the intent of the law by drafting ESSA plans that
include equitable educational opportunities in both improving educational programs and closing
education achievement gaps (EDGAR, 2018). Federal guidance is used by the district and
WSDE reciprocally to monitor progress towards effective development and implementation of
the activities proposed through the consolidated application for federal funding (Baker, 2006;
Mayer, 2011; Senge, 1990). This study will measure the procedural and metacognitive aspects
involved in planning and implementing federal funding at the local school district level. The
ESSA plan for federal funding for this western state is provided through the consolidated
application by each school district. The exchange of information is primarily shared between the
district and WSDE through the consolidated application for federal funds and training
opportunities offered by WSDE. Therefore, the study will measure the declarative knowledge of
the elements and sections of the consolidated application as well as developing a comprehensive
needs assessment.
Additionally, the LEA needs working knowledge of how to submit an application that
will receive final approval upon submission. If the district is able to demonstrate higher self-efficacy, its ability to complete a compliant application for federal funds increases, leading to more favorable outcomes (Bandura, 2005; Pajares, 2006). Self-efficacy is determined through the
working relationship between the school district and WSDE through training and technical
assistance offered throughout the development and implementation of local school district’s
applications for federal funding. This culminates in an ESSA plan where the LEA determines
how to best improve activities aimed to increase student achievement for historically
underserved groups as well as improving the quality and effectiveness of educators (ESSA,
2018). This study will measure the LEA’s ability to attribute success in approval of the
application to their abilities and not to the perceived power relationship that exists between
WSDE and the local school district (Anderman & Anderman, 2006; Pintrich, 2003: Rueda,
2011). The commitment between WSDE and local school districts to plan federally funded
activities ensures programming that goes beyond compliance to effective programming and
implementation.
The intersection of the knowledge and motivation factors presented above fosters the environment in which a culture of trust can exist between the LEA and WSDE. Trust is created
through the interactions between the two agencies through clear communication of expectations
and engagement in a collaborative decision-making process (Agocs, 1997; Korsgaard &
Whitener, 2002). The district’s willingness to engage in and accept feedback from the WSDE
further enhances the partnership in training and creates positive outcomes for underachieving and
historically underrepresented students (Erez & Gati, 2004; Schein, 2017). Senge (1999) offers
the notion of creative tension by illustrating that organizations can recognize their current reality
and create a tangible vision for future planning. Senge (1999) further contends that this is a
crucial aspect of the collaborative decision-making process in forging a collaborative vision for
two organizations working together. Paying attention to the KMO potential influences as an
iterative and interactive process as illustrated in the conceptual model helps to operationalize the
factors necessary in increasing approval rates for federal applications.
Summary
This evaluation study examines possible root causes of the disproportionately low approval rate of Local Education Agencies' applications for federal funding at a western State Educational Agency. To inform this study, this chapter reviewed literature related to
supplemental federal funding for Kindergarten through Twelfth grade (K-12) Local Education
Agencies to improve educational outcomes for historically underserved students. This review
also provided an overview of literature related to the decrease in federal education mandates, which allows for more flexibility as well as more oversight responsibility for the state agency.
While the literature presented a comprehensive discussion of federally funded initiatives under
the Elementary and Secondary Education Act (ESEA), there is still a gap in the research as it
relates to effective practices for planning and implementing programming for federally funded
K-12 education agencies in the United States. After the general literature review, this chapter
turned to Clark and Estes' (2008) knowledge, motivation, and organizational influences
framework used in this study to examine the disproportionately low approval rates of LEA
applications for federal funding in WSDE. The next chapter will present the study’s
methodological approach.
CHAPTER THREE: METHODS
The purpose of this dissertation is to conduct an evaluative study to analyze the need for
Local Education Agencies to receive final approval for K-12 public education federal funding
applications. The goal is to develop and implement training delivered by the State Education
Agency to help the Local Education Agency achieve 100% approval of the Consolidated
Application for federal funds. This evaluative goal aligns with the organizational mission of
100% compliance with federal funding consolidated applications within a three-month review
period. The project will employ the Clark and Estes (2008) gap analysis framework, focusing on
the knowledge, motivation, and organizational factors essential to achieving the organizational
performance goal.
The questions that guide this study are the following:
1. What are the knowledge, motivation, and organizational needs necessary for the LEA to
achieve 100% approval (compliance) of consolidated applications for federal funding?
2. What are the recommendations for practice to increase approval on the consolidated
application for federal funds?
This chapter reviews the research design and methods for data collection and analysis.
The chapter begins with an analysis of participating stakeholders. The stakeholder population is described, and the criteria used to select participants are identified. The sampling strategy
outlines the timing, composition of the sample, and the participant selection as well as the
appropriate approach to gaining access to the desired settings of the study. The data collection
and instrumentation section discusses the selected methods for the study as well as the rationale
for using quantitative and qualitative surveys for data collection. The ethics section describes the
responsibilities involved with informed consent, data security, confidentiality of data, and
anonymity of participants. The credibility and trustworthiness sections involve the qualitative
components and the strategies in the design, data collection, and analysis to ensure credibility in
every aspect of the qualitative design. The validity and reliability section relates to the
quantitative components of the study and the measures taken to increase validity and reliability.
Participating Stakeholders
While each stakeholder group involved in federal funding compliance measures is integral to the success of accepting, delivering, and disbursing federal funding, it is important to
evaluate where the grant administrators from the school district are currently with regard to the
performance goal of the study. Therefore, the stakeholder of focus for this study was school
district grant administrators responsible for submitting the consolidated application for federal
funding to the Western State Department of Education (WSDE, pseudonym). Because this
western state is composed of 178 reporting school districts, there is a large sample size to use in the quantitative random sampling of the study. All 178 districts were contacted using public-facing
school district websites. Three positions are integral in the submission of the consolidated grant
application for federal funds. The three administrative positions are outlined and stipulated in
the state consolidated application manual and are all connected to the planning, development,
submission, and implementation of the consolidated application for federal funding. Referencing
the WSDE consolidated application manual, participants were composed of Authorized
Representatives, Application Coordinators, and Application Fiscal Managers.
Survey Sampling Criteria and Rationale
Criterion 1. The Authorized Representative. This is the individual authorized to submit
the consolidated application on behalf of the LEA to the WSDE. The Authorized Representative
receives all communications from WSDE federal programs office regarding program
requirements, monitoring, and ESSA school identification of schools for additional support.
Criterion 2. The Application Coordinator. This is the individual who manages the
consolidated application on behalf of the LEA. This individual is responsible for coordinating
the completion of the consolidated application for federal funding.
Criterion 3. The Application Fiscal Manager. This is the individual who is responsible
for the consolidated application budget. The Fiscal Manager is authorized to complete and
submit requests for funds and receives fiscal related communication from WSDE federal
programs office.
Survey Sampling Strategy and Rationale
Employing the convergent parallel mixed methods model involves components of both
quantitative and qualitative data collection (Creswell & Creswell, 2018). School district grant
administrators were contacted using public-facing school district websites. Using the
information from school district websites, 130 valid contacts were illictited from the search.
Fink (2013) suggests drawing from a relatively large sample to reduce errors. To ensure that the
sample is representative, over 50% completion (n=60) was expected for participation in the
survey.
The participants were representative of cross-sections of the 178 school districts in the
state and consisted of administrators involved with the planning, development, and submission of
the consolidated application for federal funds. The selection for the survey was composed of
school districts from varying sizes and individual needs from across the western state. One
representative who met one of the three criteria listed above was selected from each participating district for the survey questions. The participants selected were representative
because they were able to provide the information or understanding related to the research
purpose. The type of sampling typical for quantitative study is that of nonprobability sampling, in which the investigator generalizes results of the study from the sample that was used (Merriam & Tisdell, 2016). Stringer (2014) further qualifies that quantitative survey research
provides generalizable explanations providing the basis for predicting events and phenomena
through quantifiable data sets. The survey data were linked to knowledge, motivation, and organization markers to help determine gaps in understanding using the Clark and Estes (2008)
gap analysis framework.
Weighting helped ensure that the survey results were representative of the population by assigning strata based on characteristics representative of the group (Fink, 2013). It was important to elicit responses that were accurate to the experiences of a broad range of districts, from small districts (fewer than 1,000 students) to the largest district in WSDE, serving 80,000 students. The survey group was composed of the contacts listed in the consolidated application, all of whom were found through publicly available information on school district websites. They were the most appropriate selection because they are the grant administrators responsible for the planning, implementation, and submission of the federal funds for the school district. This sampling helped to inform the second aspect of the convergent parallel methods study, the qualitative survey with open-ended responses.
In conjunction with the quantitative surveys delivered to school districts, WSDE staff
who oversee the submissions and review of the consolidated application were surveyed using
open-ended qualitative questions. There are twenty staff at WSDE responsible for overseeing
the submission and review of consolidated applications. All twenty staff received the qualitative
survey with open-ended questions. The qualitative questions were designed to determine what
organizational factors are influencing the knowledge and motivational gaps described in the
survey results from the LEA (Merriam & Tisdell, 2016). The qualitative survey focused on both
the fiscal and programmatic perspective of state agency representatives who review and oversee
review processes for the consolidated application for federal funds. The words and data
analyzed using the convergent mixed-methods approach were compared and related in the
overall findings of the study. This format for data collection was heavily reliant upon equal
sample sizes and quantitative (construct) and qualitative (triangulation) validity established
through all data collection methods in the study (Creswell & Creswell, 2018).
Maxwell (2013) qualifies that the use of purposeful selection achieves the representativeness of a sample, captures a range of participants, and ensures the proper selection of participants. Maxwell (2013) provides possible goals for purposeful selection when approaching
a qualitative research design. One of the goals is to ensure that the group is representative of the
target population. As the state grant coordinators are responsible for the training of the local
school district administrators, it is important to address the knowledge, motivation, and
organizational influences from the state administrator perspective as well. Additionally,
Maxwell (2013) argues that there needs to be a range of variations to make relative comparisons
between the groups. This variation in data collection helps ensure the validity and trustworthiness of the study.
The selection for this study is in support of the conceptual framework because it is
representative of the various stakeholders involved in the development and submission of the
consolidated application for federal funds. The SEA grant coordinators are also aligned with the
intent of the Every Student Succeeds Act (ESSA) to provide guidance around supplemental
educational opportunities to historically underserved students to close existing achievement gaps.
The LEA grant administrators are responsible for implementing the programs that close
achievement gaps at the local level. The school district federal program administrators are
responsible for data reporting of targeted populations identified under ESSA, which contributes
to the statewide goal of closing achievement gaps through supplementary programming provided
with federal funding under ESSA.
Explanation for Choices
Because this study is evaluative, the coupling of quantitative and qualitative approaches
using survey data from LEA district grant administrators and qualitative survey data from WSDE
staff uncovered the knowledge and motivational factors that influence final approval of
consolidated applications for federal funding. The survey results informed the qualitative open-ended survey questions and helped determine how the knowledge, motivational, and organizational
influences interact. Johnson and Christensen (2014) discuss the utility of qualitative open-ended
questions and surveys through the fundamental principle of mixed research. This principle
suggests multiple pathways of investigation to elicit data in a research study using
complementary and nonoverlapping instruments.
Quantitative Data Collection and Instrumentation
The content of the quantitative survey focused on the essential components of submitting
an application that meets final approval of the Consolidated Application for federal funding as
reviewed in chapter two. The final approval process considers both the State Education Agency
(SEA) role as well as the Local Education Agency (LEA) role in the process of reviewing and
granting approval for federal grant applications for supplemental educational funding. The
quantitative data collection processes were employed in this study to gain insights regarding the
knowledge, motivation, and organizational influences that affect final approval (compliance) of
applications from the LEA to the SEA for federal funds. Because of the large number of school districts in the western state, a large quantitative random sample facilitated selecting and reaching a larger group of people using survey instrumentation.
Surveys
Using the Ohio Department of Education federal programs survey and the United States Department of Education federal programming survey as references, the survey consisted of 30
questions. Creswell and Creswell (2018) provide a framework for researchers to consider when
using research questions (hypotheses) to measure against variables of the study. According to
Creswell and Creswell (2018), quantitative questions follow three basic approaches: comparing groups, relating an independent variable to a dependent variable, or describing independent or dependent variables. This survey consisted of Likert scale, multiple-choice (fixed) questions,
and short response questions that are descriptive of the motivation and knowledge influences
needed to submit a fully compliant federal grant application (Creswell & Creswell, 2018). The
majority, or 15 of the questions were closed, fixed responses aligned with the construct of
knowledge, motivation, or organizational (KMO) influence being analyzed (Clark & Estes, 2008;
Salkind, 2017). For example, the measurement of the motivation influence of self-efficacy
employed a collection of survey items to determine the construct’s meaning and its relationship
to the respondent’s understanding.
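The scoring of these multi-item constructs is not specified in the source; one common way to summarize such a construct, assuming \(k\) Likert items each scored 1 through 5, is a simple composite mean:

\[
\bar{X}_{\text{self-efficacy}} = \frac{1}{k} \sum_{i=1}^{k} x_i, \qquad x_i \in \{1, 2, 3, 4, 5\}
\]

where each \(x_i\) is a respondent's rating on one of the self-efficacy items, and higher composite scores indicate stronger reported self-efficacy.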
The survey items are linked to the study’s research questions and conceptual framework
as they are built around systems of trust as outlined by organizational factors as well as
attribution and self-efficacy factors that are instrumental in the sharing and receiving of feedback
loops. The survey helped to quantify the KMO factors that lie behind the process of submitting
an application for federal funds and reviewing and offering feedback if the application does not
meet statutory compliance. The KMO influences that affect 100% compliance and approval of an application were examined through the delivery of the survey as it relates to research question two.
Survey procedures. After receiving approval from the Institutional Review Board (IRB), an
English-language online survey was emailed to authorized representatives of school districts in
the western state. This survey was part of the convergent parallel mixed methods protocol,
which combined the quantitative and qualitative findings. Because of this design, it was not
entirely necessary to send the survey first, but the survey was delivered prior to the qualitative
survey to allow ample time to receive responses. The submission period ran from May 2020 through July 2020 to give school district administrators sufficient time to respond, as the end of the school year is a very busy time for their schedules.
The authorized representatives from the school district responsible for submitting federal funds
were emailed a link to voluntarily participate in the survey. Using public websites for school
districts, 178 school districts were emailed and 60 emails were not valid. Of the 130 valid
emails, it was anticipated that approximately half, or 60 school representatives (grant administrators), would respond to the survey by the end of July 2020. The survey was anonymous; however, identifying demographic
information related to the school district was gathered to differentiate the responses provided
without compromising identity.
The survey was composed of both Likert scale items and questions assessing knowledge,
motivation, and organizational needs (Robinson & Firth Leonard, 2019). The survey was self-
administered through an online link using the SurveyMonkey platform. Salkind (2017) discusses
nominal categorical classification as a method to attribute constructs to question types.
Therefore, the survey was assigned nominal or categorical classifications to specific knowledge,
motivation, or organizational influences (Clark & Estes, 2008; Salkind, 2017). This survey
method served as a needs assessment to determine what grant administrators need to be
successful.
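To illustrate the nominal classification described above, one possible representation is a simple lookup that maps each survey item to the KMO construct it measures. The sketch below is illustrative only; the item identifiers and construct labels are hypothetical placeholders rather than the actual instrument.

```python
# A minimal sketch of the nominal (categorical) classification of survey
# items to knowledge, motivation, and organizational (KMO) constructs.
# The item identifiers and construct labels are hypothetical placeholders,
# not the actual instrument.

KMO_CLASSIFICATION = {
    "Q1": "knowledge-declarative",     # e.g., familiarity with the application manual
    "Q2": "knowledge-procedural",      # e.g., using the manual to inform decisions
    "Q6": "knowledge-metacognitive",   # e.g., reflecting on equitable outcomes
    "Q7": "motivation-attribution",    # e.g., success attributed to own knowledge
    "Q8": "motivation-attribution",    # e.g., success attributed to the SEA reviewer
    "Q9": "motivation-self-efficacy",  # e.g., confidence in first-submission approval
    "Q10": "organizational-setting",   # e.g., adequacy of SEA training offerings
}

def items_for_construct(construct: str) -> list[str]:
    """Return the survey items classified under a given KMO construct."""
    return [item for item, label in KMO_CLASSIFICATION.items() if label == construct]

print(items_for_construct("motivation-attribution"))  # ['Q7', 'Q8']
```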
The constructs or classifications were measured against the declarative and procedural knowledge types using the survey questions (Aguinis & Kraiger, 2009; Clark & Estes, 2008; Krathwohl, 2002). The motivational classifications were measured using written
survey items based upon attribution and self-efficacy theory influences and organizational
influences that include quantitative survey items to determine cultural models and settings
regarding the efficacy of training initiatives offered by the SEA (Anderman & Anderman, 2006;
Bandura, 2005; Pajares, 2006; Pintrich, 2003). Items related to motivation and knowledge are
based upon existing reliable and valid surveys from the Ohio Department of Education as well as
a federal programs survey from the United States Department of Education (USDE). The surveys are considered reliable because they are fully published and underwent extensive peer review by the state and federal agencies before publication. This is evident in the survey design,
methodology, and references. This organization is consistent with the conceptual framework of
the study as it relates to the exchange of information between both organizations at the local and
state agency level. Additionally, the survey design is complementary to the research questions as
the questions are intentionally organized to determine the possible causes of gaps in
understanding in the development and implementation of federal programs at a school district
level. Therefore, the questions were designed to determine the declarative knowledge gaps that
exist in the understanding of the grant application as well as in the development of a needs
assessment. Organizational items were designed explicitly for this study. When the data
analysis is complete, all copies of data will be destroyed or deleted.
Qualitative Data Collection and Instrumentation
For this mixed-method, convergent parallel evaluative study, qualitative surveys using
open-ended questions were used to validate the KMO influences at play in the grant application
review at a state educational agency (Clark & Estes, 2008; Creswell & Creswell, 2018). In an
article describing ethical data collection protocol, Glesne (2011) emphasized the importance of
ensuring that subjects understand the voluntary nature of their participation. This study assured
participants of complete confidentiality regarding information, identity, and data record
collection. Participants were asked for permission to use data produced by this study for other
institutional purposes (Merriam & Tisdell, 2016).
According to Weiss (1994), qualitative questions help to elicit the process, description,
interpretation, and substantiation of variables from the prompts in a written, open-ended survey
format. This study employed a qualitative survey with the staff of the Western State Department
of Education federal programs unit. The purpose of the qualitative survey was to collect data
around the review side of the federal grant approval cycle, with specific interest to the
knowledge, motivation, and organizational factors that influence the approval process from
WSDE (Clark & Estes, 2008). This was consistent with the conceptual framework as both the
local school district and state education agency play integral roles in this iterative process. The qualitative survey with open-ended questions was distributed shortly after the dissemination of
the quantitative surveys. The state administrators responsible for reviewing submissions from
local school districts were given three weeks to complete the qualitative survey.
Drawing from the framework offered by Creswell and Creswell (2018), the qualitative
survey investigated the central research questions (hypotheses) of the study. Research questions
one and two are related to the extent to which WSDE is meeting its goal of 100% final approval
by the deadline for federal funding applications as well as the KMO influences that are necessary
for the LEA or school district to achieve 100% approval or compliance on the application. This
was the central question that elicited sub-questions in the qualitative survey. The open-ended
survey format was selected due to the culture and time constraints of the federal programs office.
The intent of the open-ended survey is to elicit more detailed and direct responses rather than the
conversational style of an interview format.
Qualitative Survey with Open-Ended Response
For this qualitative survey, the participants were selected using purposeful sampling. The
office of federal programs in WSDE is composed of 20 full-time grant program specialists and
coordinators that oversee the review and training protocols of the local school districts
submitting the grant applications. From the original 20 possible participants, 13 grant reviewers
were expected to volunteer to participate in the qualitative open-ended survey to provide
comprehensive qualitative information to inform this particular aspect of the research study. The
qualitative survey consisted of 11 highly structured, standardized, open-ended questions presented in a fixed format. Each survey began with instructions sent via email. These email
instructions included a protocol with confirmation of the participant’s informed consent,
permission to use the answers for data collection, and that participation is entirely voluntary
(Merriam & Tisdell, 2016). The questions were drawn from Patton’s six types of questions, as well as Strauss’s four major categories of questions (Merriam & Tisdell, 2016; Patton, 2002; Strauss et al., 1981).
The qualitative survey consisted of open-ended questions in a highly-structured format.
The questions were aligned with the variables being investigated including the KMO influences
drawn from the Clark and Estes’ (2008) gap analysis framework. The intent of the questions was
aligned with the evaluative goal of the study to determine the effectiveness of review and
training administered by WSDE. The questions were also designed to determine if WSDE
believed that it is meeting the mission of the state agency to meet 100% compliance with federal
funding of consolidated applications within three months. The questions involve conceptual
knowledge-based questions focused on training and support initiatives as well as motivational
aspects that include attribution theory with specific attention drawn to influences of a successful
application for federal funds (Merriam & Tisdell, 2016).
Credibility and Trustworthiness
Credibility and trustworthiness within this study focused on the production of data sets in
a trustworthy and reliable manner. To ensure that the findings are credible in this study,
precautions were taken to ensure that any plausible threats to credibility were addressed. Specifically, this included addressing potential biases and considering alternative hypotheses.
The following strategies were employed to maintain credibility and trustworthiness throughout
this dissertation study.
To maintain credibility and trustworthiness of this study, three specific strategies were
employed to ensure the data is credible and representative of the ideas expressed by the
participants in the study. The three strategies used to increase credibility and trustworthiness
include triangulation of data, multiple sources of data, and member checking (Merriam &
Tisdell, 2016). Triangulation of data employed different data sources and examined evidence
from the sources to build a substantiation of reoccurring themes. Triangulation of data also
reduces the risk of further biases (Maxwell, 2013). The qualitative and quantitative survey
results helped to triangulate the data sets to determine how the KMO influences contribute to the
research questions (Creswell & Creswell, 2018). By combining multiple sources of data in this
convergent mixed-methods evaluative study, the study ensured meaningful coherence among the findings, the interpretations of those findings, and the research question being investigated (Merriam & Tisdell, 2016). Finally, member checks, otherwise known as
respondent validation, were based upon solicited feedback from respondents. This needed to
occur with both the state agency and local school district to ensure the fidelity of all respondents
and participants (Merriam & Tisdell, 2016).
Validity and Reliability
To increase the validity and reliability of the results of the quantitative survey, measures were taken to ensure that the survey items are comprehensible, that the questions have face validity, and that the instruments themselves were tested for validity. One component
addressed in the study to ensure validity included the use of existing surveys from public-facing
data sets. The surveys were vetted through extensive research teams from the Ohio Department
of Education as well as the United States Department of Education. This process included peer
review and evidence-based practice that is supported through empirical research. The survey
questions have been adapted for this study to maintain validity and reliability of the questions.
The questions in the pre-existing surveys relating to conceptual and procedural knowledge
influences are easily adapted to the constructs measured in this study.
According to Salkind (2017), reliability is the extent to which a variable is measured
consistently. In this study, the survey instrument was measured against nominal categorical
classifications of KMO influences (Clark & Estes, 2008). The survey questions included
potential filter questions that allowed the respondent to proceed to an additional question or skip
to a different question if the answer did not apply to the respondent (Robinson & Firth Leonard,
2019). In this way, respondents provided more accurate and reliable answers reflecting determined opinions or attitudes, reducing potential “I don’t know” answers. When a respondent did indicate that they did not know, the answer still provided depth and detail regarding the misunderstandings the respondent may have.
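The filter (skip-logic) behavior described above can be illustrated with a small routing function. The sketch below is a minimal illustration; the item identifiers and routing rules are hypothetical and do not reproduce the actual survey.

```python
# A minimal sketch of the filter (skip-logic) behavior described above:
# if a filter response does not apply, the respondent is routed past the
# follow-up items. Item identifiers and routing rules are hypothetical.

def next_item(current_item: str, response: str) -> str:
    """Return the next survey item given a response to the current item."""
    skip_rules = {
        ("Q4", "Does not apply"): "Q7",  # skip the follow-up items Q5 and Q6
    }
    default_order = ["Q4", "Q5", "Q6", "Q7", "Q8"]
    if (current_item, response) in skip_rules:
        return skip_rules[(current_item, response)]
    position = default_order.index(current_item)
    return default_order[position + 1] if position + 1 < len(default_order) else "END"

print(next_item("Q4", "Does not apply"))  # Q7
print(next_item("Q4", "Agree"))           # Q5
```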
The survey design included peer-feedback before dissemination of the online survey to
maximize the validity and reliability of survey items measuring knowledge, motivation, and
organizational elements. The survey was piloted and the questions were analyzed for validity
and reliability using a peer organization of similar size and capacity. Peer review also helped to
eliminate bias as the primary researcher in a study can often interpret or understand
organizational issues from their own perspective (Merriam & Tisdell, 2016). Using peer evaluation facilitates a broader understanding of concepts from multiple perspectives. The survey was delivered with a deadline of 30 days to complete. Monitoring through the Qualtrics functionality, coupled with reminder emails, helped ensure that participants were completing the survey and that completion rates were adequate for data collection.
Ethics
This study examines the approval rate of Local Education Agencies in submitting and completing applications for federal funding using a convergent parallel mixed-methods design (Creswell, 2014). As human participants are involved in this mixed-methods approach,
the responsibility lies on the researcher to conduct an ethical study involving informed consent,
voluntary participation, confidentiality of data and participation, recording permissions, and
storing and securing data (Merriam & Tisdell, 2016). Because the study employs a mixed-
methods approach, it will involve a survey element to collect quantitative data and an open-
ended survey collection of qualitative data.
Glesne (2011) discusses the implications of informed consent as a vehicle for
empowerment for research participants. Informed consent occurred before random sampling of
survey data collected at the beginning of the study through survey data and in the second phase
of open-ended qualitative survey collection. Consent forms were presented to the participants at
the beginning of data collection. The author contends that this step is also necessary to ensure
that participation in the study is voluntary, the discussions and data collection are confidential,
and also to allow participants to withdraw at any time (Glesne, 2011). Participants were
provided with the qualitative survey written questions ahead of time to complete when time
allowed. Merriam and Tisdell (2016) assert that there are ethical considerations within the
collection and dissemination of data and that it be stored in a secure location where it cannot be
accessed or tampered with by anyone except the researcher. The data were securely stored
throughout data collection and analysis and will be destroyed upon completion of the study.
Merriam and Tisdell (2016) discuss ethical practices as they relate directly to the
researcher’s worldview, biases, values, and ethical principles. The authors contend that rigor and
ethical practices are not ensured merely through methodology and technique, but through the
manifestation of all of the elements of the study intricately linked. Therefore, the methods and
analysis are all tied together and are also based upon the assumptions and principles that the
researcher brings to the study. As a state employee, it is my personal bias that the applications
meet the intent of the federal law to provide equitable outcomes to historically underserved
students, but also that the approval process is expeditious to ensure the funds are obligated
promptly. With this bias in mind, it is important to view the knowledge, motivation, and
organizational factors ethically and objectively to ensure the reliability of the data collected
(Clark & Estes, 2008). This was further discussed in the validity and reliability sections.
Rubin and Rubin (2012) further illustrate the importance of the ethical responsibilities of
a researcher. According to the authors, the researcher has a responsibility to show respect for his
or her research participants by demonstrating honesty and alignment of the topic. To uphold
these values, the study disclosed its purpose to explore federal grant approval from the state to
the local school district with full transparency. Furthermore, the email invitation to district
representatives explicitly stated that the research was conducted by a graduate student at the
University of Southern California and was not conducted by the WSDE in any way. Voluntary
participation further protected respondents’ confidentiality without any pressure to participate and upheld the fundamental research principle to do no harm at any stage of the
study.
Limitations and Delimitations
According to Creswell and Creswell (2018), the convergent parallel mixed methods
procedure is appropriate for this research design because of the availability of data from the
primary stakeholder, the LEA, and the organization responsible for reviewing the application, the
SEA. Quantitative and qualitative data collection was combined to provide a comprehensive
exploration of the research question in this evaluative study (Creswell & Creswell, 2018). The
data collection approach is consistent with the conceptual framework as it explores the
relationships that exist between the state and local education agencies. The quantitative
measures for the study were designed in an attempt to control anticipated and unanticipated
threats to validity (Maxwell, 2013). The constructs are mapped to capture the most accurate
alignment to the KMO influences of the Clark and Estes (2008) gap analysis framework. The
qualitative open-ended written question format was selected due to the very close nature of the
office culture at the Western State Department of Education. Staff relationships are valued, creating a collegiality that could potentially influence interview data collection. Therefore, the short-answer format is intended to keep the questions and respondents’ answers more objective for data collection and analysis.
Potential challenges include issues with respondents completing the survey, ensuring
language captures what the grant administrators need, and the lack of current research that exists
for federal funding to K-12 education agencies. The survey is not required for the school district
grant administrators’ scope of work but could help to facilitate more streamlined approaches to
planning, implementation, and grant approval. The language of the survey will need to be
consistent with the content of the grant application as well as the resources and guidance that are
provided to the school districts to maintain consistency of terminology.
CHAPTER FOUR: RESULTS AND FINDINGS
The reauthorization of ESSA has expanded flexibility for funding and accountability
initiatives for state education agencies. As a result, districts are innovative with promising
programs and initiatives using federal funds, but the timeline in which school districts can
obligate the funds is compromised due to the approval process from the state agency (Fránquiz &
Ortiz, 2016; USDE, 2017; Weiss & McGuinn, 2016). The Elementary and Secondary Education
Act of 1965 was born from a tumultuous landscape in which students living in poverty, students from marginalized racial groups, and students of diverse national origins were not receiving equitable academic outcomes.
Federally funded programs presented a new horizon for the education of disadvantaged children
marking a new era in education reform. This is evident within the first line of the Act charging
educators “to provide all children significant opportunity to receive a fair, equitable, and high-
quality education, and to close educational achievement gaps.” With the adoption of the Every
Student Succeeds Act (ESSA) in 2015, the federal government provides school districts with the
ability to design equitably-driven initiatives under the guidance of the state agency.
While the initiatives under ESSA are intended to close achievement gaps for historically
underserved students, there is a general lack of research for approval of funds from a state
agency in a timely manner. When federal funds are not approved by the SEA to the LEA, the
funds cannot be encumbered and state or local funds have to absorb the cost (EDGAR, 2018).
The school district is not able to obligate funds for new activities which delays programming for
identified subgroups including economically disadvantaged students, students from marginalized
racial groups, children with disabilities, and English learners (ESSA, 2018). In 2018, the federal
programs unit of a western State Educational Agency granted substantial approval to 70% of the
LEA applications within a thirty-day approval deadline. The State Agency’s target is to return
applications by July 31st, ideally one month after the application submission
deadline. According to the Western State Department of Education, historical trend data show that in the 2017-2018 and 2018-2019 school years the state was six months away from this target, and in 2019-2020 the state was five months from its target (N. [Redacted], personal communication, February 21, 2019).
To understand the knowledge, motivation, and organizational barriers to 100% compliant
federal applications upon the first submission, members of both the state education agency and
local education agencies were surveyed. In total, 178 school districts were contacted to complete
a quantitative survey and 30 grant reviewers from the State Education Agency were provided
with qualitative surveys. This study was guided by the following research questions:
1. What are the knowledge, motivation, and organizational needs necessary for the LEA to
achieve 100% approval (compliance) of consolidated applications for federal funding?
2. What are the recommendations for practice to increase approval on the consolidated
application for federal funds?
Purpose
The purpose of this chapter is to explain the findings from the quantitative surveys and
qualitative surveys with open-ended response. The chapter outlines the results of the study,
including the demographic data and roles of those who participated in both the quantitative and
qualitative aspects of the study as well as the response rate. Additionally, the research questions
that guided the study will be answered using both qualitative and quantitative data.
Participating Stakeholders
The quantitative component of this study distributed surveys to 178 school districts and
the federal grant administrators at each school district site. Each school district had anywhere
from one to six federal grant administrators at each site. Every school district in the western state was selected, including suburban, rural, and urban districts. For the quantitative portion of the study, a total of 48 email addresses were not accurate, so 130 of the 178 districts received the survey. Of the 130 contacted, 61 grant administrators responded, a 47% participation rate. Of the 61 total participants in the study, one did not respond to any questions past the demographic information in the survey, changing the n count for subsequent items to 60. Grant administrators that did not respond declined participation for many reasons including, but not limited to, lack of time caused by the Coronavirus global pandemic, lack of time due to social distancing constraints or other job duties, and a lack of updated personnel records on school district websites. Table 6
outlines the response rate of grant administrators from LEAs.
Table 6
Quantitative Surveys: Response Rate

Measure                 Invited to Participate    Participated    % Participated
Grant Administrators    130                       61              47%
The quantitative survey collected demographic responses to provide further detail into the
experience of the grant administrators submitting the consolidated applications. Of the total 61
respondents, 17% identified as male, 82% female, and 1% preferred not to say. The years of
service at the school district for each grant administrator that responded to the survey ranged
from less than one year to 30 or more years. The years of service for each grant administrator is
outlined in Table 7.
Table 7
Quantitative Survey: Years of Service

Years of service       Grant Administrators      %
Less than 1 year       4                         6
1-6 years              18                        30
7-12 years             12                        20
13-18 years            8                         13
19-24 years            12                        20
25-30 years            4                         6
30 or more years       3                         5
Total                  61                        100
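For transparency, the descriptive figures reported in Tables 6 and 7 can be reproduced with simple arithmetic from the counts above. The short sketch below uses those reported counts; the table reports whole-number percentages, so rounded values may differ slightly.

```python
# A short sketch reproducing the descriptive figures in Tables 6 and 7
# from the counts reported above. Percentages are shown with one decimal,
# so they may differ slightly from the whole-number values in the tables.

invited, participated = 130, 61
print(f"Response rate: {participated / invited:.0%}")  # ~47%

# Years of service for the 61 responding grant administrators (Table 7)
years_of_service = {
    "Less than 1 year": 4, "1-6 years": 18, "7-12 years": 12, "13-18 years": 8,
    "19-24 years": 12, "25-30 years": 4, "30 or more years": 3,
}
total = sum(years_of_service.values())  # 61
for band, count in years_of_service.items():
    print(f"{band:<18}{count:>4}  ({count / total:.1%})")
```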
The qualitative aspect of this study focused on qualitative surveys with 21 participating grant reviewers from the Western State Department of Education. The grant reviewers all work within the federal programs unit and are divided into the Culturally and Linguistically Diverse Education office, Elementary and Secondary Education Act office, Migrant Education office, and Competitive Grants and Awards office. Of the 28 invited to participate, 21 agreed to answer the qualitative, open-ended questions of the survey. Before the initiation of the study, 13 of the 28 potential participants were anticipated to respond and participate in the survey. Instead, 21 responded, for 75% participation overall. All participants from the federal programs
unit were selected because they all review the consolidated application or some aspect of it.
Table 8 provides an overview of participants in the qualitative aspect of the study.
Table 8
Qualitative Surveys: Response Rate

Measure             Invited to Participate    Participated    % Participated
Grant Reviewers     28                        21              75%
Grant reviewers from the WSDE were requested to provide demographic information pertaining
to the years of service at the state agency to provide further context into the experience and
expertise of the participants surveyed. Table 9 provides an overview of the years of service for
each employee surveyed in the qualitative open-ended responses.
Table 9
Qualitative Survey: Years of Service

Years of service       Grant Reviewers      %
Less than 1 year       1                    5
1-6 years              16                   76
7-12 years             4                    19
Total                  21                   100
Knowledge Results
The first component of the Clark and Estes (2008) gap analysis framework consists of
knowledge influences that impact the Western State Department of Education (WSDE) and
Local Education Agency (LEA) performance goal of meeting 100% compliance of the
consolidated application upon the first submission. The state and local education agency’s
ability to train for and submit a compliant application leads to further implications of success for
students through the funding efforts to supplement existing programs for the benefit of
historically underserved students. The funding provides an opportunity for students to receive
equitable education and is designed to close achievement gaps (ESSA, 2018). For this reason, it
is critical for both the state and local education agencies to understand the components of the
federal funds including the needs assessment and reflecting on the effectiveness of programming.
According to the literature, an individual’s ability to understand declarative knowledge leads to
further technical skills such as following procedures and ultimately, metacognitive functions
such as reflecting on the funding sources (Aguinis & Kraiger, 2009). The following findings
were examined to illustrate the gaps in understanding as they relate to planning, developing, and
implementing programs outlined in the consolidated application for federal funding by the LEA
and SEA, respectively. Three knowledge gaps, declarative (factual), procedural, and metacognitive, were validated during data collection. Influences were considered invalidated if less than 75% of the evidence supported the assumed influence. If more than 75% of the evidence validated the influence, the gap was considered validated and a viable data source for the study. If insufficient data existed, the gap was undetermined (see the sketch following Table 10). Table 10 identifies assumed knowledge influences and the summary of findings for each influence.
Table 10
Assumed Knowledge Influences, Determination, and Summary of Findings

Assumed knowledge influence: LEA needs declarative knowledge of the elements of the Consolidated Application for federal funding in the western state K-12 public education system.
Determination: Gap validated. LEAs need further guidance and supports for the elements of the Consolidated Application for federal funds.

Assumed knowledge influence: LEAs need to know how to incorporate a comprehensive needs assessment into activity plans for Title funding.
Determination: Gap validated. LEAs need to incorporate the needs assessment into activity plans for Title funding.

Assumed knowledge influence: LEAs need to know how to self-reflect on the effectiveness of their federal programs.
Determination: Gap validated. LEAs need further guidance on strategies to self-reflect on the effectiveness of federal programs at the local level.
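The decision rule described before Table 10 can be expressed as a small function. The sketch below follows the narrative description (more than 75% of the evidence validates an influence, less than 75% invalidates it, and insufficient evidence leaves it undetermined); the minimum-evidence count and the example inputs are hypothetical assumptions, not values taken from the study.

```python
# A minimal sketch of the gap-determination rule described above: an assumed
# influence is validated when more than 75% of the evidence supports it,
# invalidated when less than 75% does, and undetermined when too little
# evidence exists. The minimum-evidence count and the example inputs are
# hypothetical assumptions rather than values taken from the study.

def determine_gap(supporting: int, total: int, minimum_evidence: int = 5) -> str:
    if total < minimum_evidence:
        return "undetermined"
    return "validated" if supporting / total > 0.75 else "invalidated"

print(determine_gap(13, 15))  # validated (about 87% of the evidence supports it)
print(determine_gap(7, 15))   # invalidated (about 47%)
print(determine_gap(2, 3))    # undetermined (insufficient evidence)
```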
Elements of the Consolidated Application for Federal Funding
The first finding that emerged through data analysis involves the LEAs’ need for declarative knowledge of the components of the consolidated application. Sixty-one district grant administrators were asked in the survey to rate their overall understanding of the consolidated
application and LEA user manual. This item relates directly to the study’s research question,
with particular attention to the administrator’s need for declarative knowledge of the components
of the application. As suggested by Aguinis and Kraiger (2009), declarative knowledge is a
framework from which additional training with more technical elements can be developed.
Drawing from this idea, the basic understanding of the consolidated application is the foundation
from which other skills can be built and enhanced. The majority of answers (60% of respondents) corresponded with the agree category, followed by strongly agree (23%), disagree (15%), and strongly disagree (2%). The frequency of respondent information is presented in Figure 2.
Figure 2. I am Familiar with the Consolidated Application Manual.
By providing examples of their understanding of the consolidated application and its
components, the grant reviewers surveyed at WSDE revealed that there are different levels of
understanding of the components of the consolidated application from the state perspective. The
state is responsible for training the LEA in the components of federal programming and funding
as well as the consolidated application for federal funding. The state employees were asked to
describe how well, if at all, the grant reviewers understand the consolidated application and its contents. Of the 17 respondents who answered the question, 6 responded that they somewhat understand the application, 6 responded that they understand the components of the application, and 5 indicated a thorough understanding of the application and its contents. In general, this
response rate indicates a sentiment of understanding of the application from the state agency
perspective.
While both stakeholders involved at the SEA and LEA are aware of the components of
the consolidated application, there remains a significant gap in the target goal of reaching 100%
compliance within thirty days of the submission. Therefore, the guidance associated with the
application could also influence the final approval of submissions to the SEA from the LEA.
The guidance provided by the SEA covers the declarative knowledge components of the
application including the training manual, year at a glance planning document, and state ESSA
plan. Additionally, the review of the literature found that guidance exists for federally funded
activities through the Education Department General Administrative Regulations (EDGAR, 2018); however, there is a general lack of research around planning, implementing, and
evaluating effective programs under ESSA (2018). A state employee noted that there is a
general lack of “understanding of the use and purpose of funds” from the LEA administrators
submitting the application. This lack of understanding implied by the state employee could be
attributed to the guidance provided at the state level and understanding of the guidance from the
LEA perspective.
When responding to the survey item concerned with using the consolidated application
manual to understand the application and its contents, nearly half of the 60 grant administrators (49%) responded that they disagree (42%) or strongly disagree (7%) that they use
the manual. Table 11 shows the measure of frequency and percentage for this question.
Table 11
Question 2: Frequency and Percentage of Responses

Response option       Percentage of respondents
Strongly agree        8
Agree                 43
Disagree              42
Strongly disagree     7
Based on this data, Figure 3 presents a visual of the division of attitudes between strongly
agree/agree and strongly disagree/disagree.
Figure 3. I Use the Consolidated Application Frequently to Inform Decisions.
The LEAs’ responses indicate that there is a strong understanding of the consolidated application; however, the number of resubmissions to the SEA indicates otherwise. One
member of the SEA review team notes that “guidance documents are extremely detailed.” Some
examples of guidance documents include the ESSA State Plan, the Consolidated Application
manual, and the annual planning document or “Year at a Glance.” The problem of practice identified for this study is that the target of thirty days for submission and approval of funds, as outlined by the Education Department General Administrative Regulations (EDGAR, 2018), is delayed by six months. This delay could be due to the LEA not having access to, or not referencing, guidance provided by the SEA. This is further supported by the LEA survey item that references familiarity with the ESSA State Plan. Of the 60 respondents in the survey, 35%
responded that they strongly disagree or disagree with the statement, “I am familiar with the
ESSA State Plan.” The other 65% of respondents indicated that they are familiar with the ESSA State Plan, showing that a large number of LEAs in the state are familiar with the state’s overall plan for federal funds. The number of disagreements further validates the
need for declarative knowledge of the elements of the consolidated application for federal
funding in the western state K-12 public education system.
The qualitative surveys with 21 grant reviewers from the state revealed additional
perceived gaps in declarative knowledge. This is evident in the responses to the question, “Why (or why not) do you think the LEA is capable of receiving final approval on the first submission of the consolidated application?” Over half of the respondents (13 of the 21) indicated that misunderstandings of the elements of the consolidated application are linked to the usage, purpose, and overall understanding of the application as a whole. The SEA staff reported that there is a general disconnect in “what is
needed and offered support” along with “well-written plans and fully developed budgets that
align with the planned strategies.” The grant reviewers comment on the need for the school
district applicants to have more understanding of the components and requirements of the
application. Another SEA reviewer noted that they believed LEAs do not “fully understand
requirements of Title funding…timing of when the application is due and availability of
information/data needed to complete the application.” The SEA reviewers note the declarative
gaps in understanding and potential issues with information impeding this understanding. In
statute, ESSA refers to the LEA application as the LEA plan, which should contain all of the components necessary for approval. The components include a completed, approximate budget, stakeholder engagement descriptions, a needs assessment, and assurances that all of the funds’ requirements are met (ESSA, 2018). The comprehensive needs
assessment is a component of the application where all priorities intersect and will be discussed
further in the following section.
Incorporating a Comprehensive Needs Assessment into Federal Funding Plans
The second finding relates to what Krathwohl (2002) refers to as procedural knowledge.
This type of knowledge falls into the processes and procedures involved in the planning and
development of federally funded education initiatives. Local flexibilities were afforded through
the passage of the Every Student Succeeds Act reauthorization of the Elementary and Secondary
Education Act. These new flexibilities are for innovative activities to meet academic and
linguistic outcomes for historically underserved students (USDE, 2017; Weiss & McGuinn,
2016).
Grant reviewers from the SEA discussed general training opportunities offered by the
state agency to the LEA; however, the state agency employees noted that training topics are
generally selected by the SEA and additional training is provided if requested by the LEA.
Several grant reviewers surveyed indicated that the training is not provided to every LEA and
often the LEAs that lack the procedural knowledge necessary to develop a needs assessment and
identify strategies for funding do not attend. The data from surveys and open-ended qualitative
responses revealed that the SEA provides opportunities for training and guidance through regional contacts, regional meetings, technical assistance requests, published guidance, and email notifications. Additionally, because COVID-19 occurred during the survey timeline, considerations for training were extended to webinars, phone calls, and online guidance and resources. When considering the SEA’s ability to align initiatives to effective practices or
evidence-based approaches, a SEA reviewer described training in an open-ended survey question
saying:
We could do a better job of helping LEAs prioritize initiatives, by aligning strategies
across systems-level work. We could align the consolidated application process to
timelines and due dates for projects in which they are already engaged. We could save a
lot of time and man-hours by aligning our application and improvement systems to "talk"
to each other and use the data, narratives, strategies developed, timelines for
implementation, implementation benchmarks, and progress-monitoring across programs
and initiatives.
The SEA grant representative reflects on the need for more of a portfolio management
system. The alignment of “strategies across systems-level work” refers to the different research-
based and evidence-based strategies, building a needs assessment to match the priorities outlined
in the application, as well as the systems of grant application (identifying students at need, at-
risk, and meeting performance benchmarks). The SEA grant reviewer refers to implementation
benchmarks that can be further built out to support the LEA in creating a needs assessment that
meets school and district program needs and initiatives. This finding was validated through
additional survey data for the question item, “I am confident that my use of federal funds is
meeting the needs of my LEA, both academically and linguistically.” The 60 responses from the
LEA representatives to this survey question indicated a mean of 1.62 and a standard deviation of .57. The standard deviation, or the average distance of responses from the mean, illustrates how the responses varied around the average. Of the respondents, 33.33% strongly agreed with this statement, 60% agreed, and 6.67% disagreed. These data indicate that LEAs prioritize initiatives and do consider strategies
and program needs to be an effective component of federally funded initiatives. The
comprehensive needs assessment is intricately tied to academic and linguistic outcomes and is part of the planning component of the application, according to the western state’s manual for the consolidated application.
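The mean and standard deviation reported for this item can be computed directly from coded Likert responses. The sketch below is a minimal illustration; the coded responses are hypothetical, and the coding direction (1 = strongly agree through 4 = strongly disagree) is assumed for illustration only, since the study’s raw data are not reproduced here.

```python
# A minimal sketch of computing descriptive statistics for a four-point
# Likert item. The coded responses below are a hypothetical illustration,
# not the study's raw data, and the coding direction (1 = strongly agree
# through 4 = strongly disagree) is assumed for illustration only.
from statistics import mean, pstdev

responses = [1, 2, 2, 1, 2, 3, 2, 2, 1, 2]  # hypothetical coded responses

item_mean = mean(responses)
item_sd = pstdev(responses)  # population SD; statistics.stdev gives the sample SD
print(f"n = {len(responses)}, mean = {item_mean:.2f}, SD = {item_sd:.2f}")
```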
To determine the grant administrators’ familiarity with the comprehensive needs
assessment, a survey item asked respondents to provide a rating associated with the statement, “I am familiar with the Comprehensive Needs Assessment.” The majority of answers corresponded with the agree (58%) or strongly agree (25%) categories, an aggregate of 83% of respondents, with the remaining 17% indicating disagree and no answers for strongly disagree.
This data suggests that the grant administrators are familiar with the comprehensive needs
assessment; however, there is evidence of a gap in this influence due to the state representative’s
references to a need to enhance current guidance, supports, and systems available to the LEA to
improve the process of developing a needs assessment. By developing a needs assessment,
LEAs can further prioritize initiatives that will lead to effective and research-based programs for
students. The next influence focuses on the findings related to metacognition or reflecting on
federally funded initiatives.
Reflecting on Federal Funds
The third validated influence from Table 10 involves metacognitive knowledge.
Metacognition involves an individual’s understanding of their own cognitive processes (Baker, 2006). Reflection on federal funds relates to awareness of how one learns and how one monitors and directs implementation toward desired outcomes (Mayer, 2011). At the State
Education Agency, the grant reviewers expressed different beliefs regarding the reflection of
effective programming. When describing the components of reflection in federally funded
initiatives, 15 of the 21 SEA employees surveyed remarked that they believed the monitoring
processes at the state level were very much focused on compliance-based training situated
towards statutory elements connected to the funding initiatives (EDGAR, 2018). This finding
was further supported by a grant reviewer discussing their perspective of reviewing consolidated
applications for federal funding. The reviewer said, “the reviews are not a check to ensure
activities/programs/services are evidence-based and supported by research.” This statement
further explores the notion that the reviewers at the state agency view the review as more of a
compliance check rather than a review of effective programming efforts. The grant reviewer
engages in self-reflection based upon their own consideration of the effectiveness of review from
the perspective of the state agency. Furthermore, this grant reviewer discusses the statutory
elements as outlined in the Every Student Succeeds Act (2018): that the LEA Plan, or consolidated application for federal funds, should be grounded in evidence-based practices or supported by research. This is evidence that the reviewer engages in metacognitive processes through their own self-reflection on the review and how it does not meet the elements outlined in statute, contributing to overall outcomes beyond the individual and the review.
Thirteen of the 21 SEA grant reviewers surveyed discussed monitoring efforts from the SEA to the
LEA. One reviewer noted that “the monitoring process has been developed to move SEAs to
being compliant and having practices that are effective,” but did not further qualify the evidence-based practices or the supports provided by the state agency. Another grant
reviewer reflected that supports for evidence-based practices from the state agency support
metacognitive reflection because there is a strong emphasis on effective practices and meeting
the intent of the funding. The grant reviewers suggest that they try to “weave in effective
practices” and make it clear that supports offered by other state members vary. Another grant
representative comments that “monitoring programs [are] collaborative with districts and
schools” and “the monitoring process has been developed to move SEAs to being compliant and
having practices that are effective.” The responses indicate that the grant reviewers further
engage in self-reflection of the individual’s own effectiveness as it relates to the review process.
The open-ended survey responses showed different understandings of the monitoring
efforts and the role that the state plays within the monitoring and grant review feedback.
In another example, an SEA representative commented that “Most [LEAs] just
respond to comments [from SEA reviewer] or they just contain an action statement for how to
bring the cons app response into compliance but doesn’t really address best practices around
programming.” This response from the SEA reviewer shows how the feedback from the SEA to
LEA is not necessarily interpreted as coaching for improvement or programming, but simple
fixes to meet compliance such as budget or coding checks. The metacognitive aspect of this
suggests that the individual does not feel that their own efforts to review the content of the
application actually contribute to further implications or outcomes related to effective practices
from the LEA.
Furthermore, 7 of the 15 reviewers who responded regarding monitoring suggested that monitoring practices encourage further self-reflection on effective practice or research-based practice using federal funds. The other 8 reviewers indicated that support from the state agency could prioritize or encourage effective practices rather than just compliance checks. Many of the practices referenced by these 8 respondents describe feedback that pulls the application into compliance rather than toward effective practices.
Fránquiz and Ortiz (2016) discuss the shifting of power in designing and implementing programs
from the federal government to state and local levels. Following these findings, the state has
more opportunity to provide feedback and encourage effective practices through monitoring.
In the quantitative survey distributed to LEA grant administrators, the question directly
aligned with the metacognitive knowledge influence is, “I feel that my efforts to receive final
approval on the Consolidated Application will lead to equitable outcomes for students.” The
LEA survey responses to the question aligned with this influence are presented in Table 12.
Table 12
Efforts to Receive Final Approval Leads to Equitable Outcomes

Measurement            Data
Range (Min)            1
Range (Max)            4
Mean                   1.95
Standard Deviation     .67
Variance               .45
Count                  60

Note. Questions without responses were eliminated from the data collection.
Like most of the items in the survey, item Q6 collected ordinal data using a four-point Likert scale (strongly agree, agree, disagree, strongly disagree). Figure
4 presents the frequency of respondent information.
Figure 4. Q6 I Feel that My Efforts to Receive Final Approval will Lead to Equitable Outcomes
for Students.
While the LEA representatives indicate through the survey data that the grant application leads to equitable outcomes for students, the SEA responses suggest that further support could be provided. Therefore, a gap has been identified for this knowledge influence.
This is explored through the sentiment of the SEA noting that, “I'm unsure to what extent
[WSDE] provides support or TA [Technical Assistance] around effective practices.” The
reviewer continued to note that the support and guidance seem to cover logistics including
allowable uses of funds, mechanisms of funding, and other details around compliance
requirements. To meet the organizational goal of 100% approval within thirty days, there are
further components beyond compliance and mechanisms of funding that involves initiating and
sustaining behaviors that influence the directed outcome of the stakeholder goal (Mayer, 2011;
Rueda, 2011). The next section outlines the results aligned with motivational influences that
were validated through data collection and analysis.
Motivation Results
Qualitative, open-ended survey questions and quantitative surveys to the LEA and SEA addressed the motivation influences of self-efficacy and attribution involved in the submission and approval of the consolidated application from the LEA to the SEA.
Drawing from the literature, motivation is a driving force for sustaining goal-directed outcomes.
Motivation is also greatly influenced and constructed by stakeholder interactions (Mayer, 2011;
Pintrich, 2003; Rueda, 2011). Most grant administrators believed that their success on the
application is both attributed to their own understanding and that of the reviewer, almost equally
in each respective category. The data for these questions have been further disaggregated to
show the correlation between both the SEA and LEA attributions. Additionally, the data
suggests a strong gap in the self-efficacy influence showing that LEAs do not believe in their
own ability to submit an application that receives 100% final approval upon the first submission.
Influences were considered invalidated if less than 75% of the evidence supported the assumed influence. If more than 75% of the evidence validated the influence, the gap was considered validated and a viable data source for the study. If insufficient data existed, the gap was undetermined.
Table 13 identifies the assumed motivation influences and summary of findings for each
assumed influence.
Table 13
Assumed Motivation Influences, Determination, and Summary of Findings

Assumed motivation influence: LEAs should feel that approval of grant applications is based on their ability to submit a compliant grant application rather than the reviewer’s comments.
Determination: Gap validated. LEAs feel that approval of the grant is based on the grant reviewer rather than the ability of the grant administrator to submit a compliant application.

Assumed motivation influence: Grant administrators need to believe that receiving the grants will improve academic performance/help close achievement gaps for traditionally underserved populations.
Determination: Gap invalidated. Grant administrators believe the grants will improve academic performance and help close achievement gaps for traditionally underserved populations.

Assumed motivation influence: LEAs need to believe they are capable of submitting an application for federal funding and receiving approval upon the first submission.
Determination: Gap validated. LEAs do not believe they are capable of submitting an application for federal funding and receiving substantial approval upon the first submission.
Attribution of Success and Failure to Own Effort
The first validated influence from Table 13 involves attribution theory. Attribution
theory involves attributing success or failure to factors or causal beliefs linked to outcomes that
include external, internal, or environmental elements (Anderman & Anderman, 2006). The LEA
grant administrators responded to Likert-type scale items and could leave additional open-ended commentary at the end of the survey. The commentary was not required and asked for any additional information that may have been omitted from the survey responses. The LEAs responded to two different questions related to the success of the application: one stating that the success of the application is dependent upon the knowledge of the applicant, and one stating that the success of the application is dependent upon the WSDE reviewer. After additional analysis of the two items, it is clear
that the attitudes and beliefs toward the reviewer between the two question responses reveal that
the grant administrators believe that success on the application is attributed to both the applicant (the LEA grant administrator) and the WSDE reviewer. Tables 14 and 15 show the
responses to each question.
Table 14
My [LEA] Success on the Application is Dependent on my Knowledge of Federal Funds

Response option       Percentage of respondents
Strongly agree        26.67
Agree                 61.67
Disagree              10
Strongly disagree     1.67
Table 15
My Success on the Consolidated Application is Dependent upon the SEA Reviewer

Response option       Percentage of respondents
Strongly agree        20
Agree                 60
Disagree              16.67
Strongly disagree     3.33
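Because the comparison in Figure 5 amounts to tabulating the two percentage distributions side by side, the reported values from Tables 14 and 15 can be summarized with a short script such as the following sketch.

```python
# A short sketch tabulating the two attribution items side by side,
# using the percentages reported in Tables 14 and 15.

q7 = {"Strongly agree": 26.67, "Agree": 61.67, "Disagree": 10.00, "Strongly disagree": 1.67}
q8 = {"Strongly agree": 20.00, "Agree": 60.00, "Disagree": 16.67, "Strongly disagree": 3.33}

print(f"{'Response option':<20}{'Q7 (own knowledge)':>20}{'Q8 (SEA reviewer)':>20}")
for option in q7:
    print(f"{option:<20}{q7[option]:>19.2f}%{q8[option]:>19.2f}%")

# Aggregate agreement for each item
print(f"Q7 agree/strongly agree: {q7['Strongly agree'] + q7['Agree']:.2f}%")  # 88.34%
print(f"Q8 agree/strongly agree: {q8['Strongly agree'] + q8['Agree']:.2f}%")  # 80.00%
```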
Both questions relate to the conceptual framework as well as the motivational influence
of attribution theory. The relationship between the SEA and LEA is reciprocal providing
progress monitoring towards effective development of activities that are allowable, reasonable,
and necessary as outlined by ESSA (Baker, 2006; EDGAR, 2018). In addition, perceived
causation is developed through antecedents and consequences according to precedent-setting
research by Kelley and Michela (1980). Framing the attribution of the LEA and SEA
relationship, the perception of status from a state education agency for compliance indicators
may differ from the grant applicant’s perception of how the grant application should be written.
In this case, the antecedent of knowledge of the components of the application can ultimately
affect the results or consequences contributing to approval of the LEA grant application.
Table 14 shows that 88.34% either agreed or strongly agreed with the statement that success on the application is dependent upon the LEA grant administrator’s knowledge of the funds, while the remaining 11.67% disagreed or strongly disagreed. Furthermore, Table 15
displays very similar results with 80% indicating strongly agree or agree to the statement that
success on the consolidated application is dependent upon the SEA reviewer. The remaining
20% disagreed or strongly disagreed with the statement. Figure 5 presents a visual comparison
of the two response sets. Respondents’ attitudes for both questions are distributed almost equally, which further supports the finding that the LEA believes success on the application is incumbent both upon the knowledge of the applicant and upon the level of review from the WSDE reviewer.
Figure 5. Item Q7 versus Item Q8.
In the open-ended response prompts, grant administrators commented on the level of supports the LEA receives, stating that the grant submissions are often the same from year to year, but the review comments from the state agency are different. Eight of 15 open-ended responses reflected that the review process changes every year and that it is difficult for the LEA to
submit an application for approval when the process is continually changing. One response
indicated, “The consolidated application and approval process can be very cumbersome and time
consuming.” There are a number of reasons identified through the open-ended responses
including that reviewers change every year the application is submitted and issues with inter-rater reliability. Another response noted, “One year it is approved and the next it isn’t. And this has
been going on for years. They need to either tighten up inter-rater reliability or look at previous
year’s approved activities because making us resubmit.” The grant administrator at the local
level is referring to the application provided to WSDE as the same application that was approved
in the prior year. The response continues to indicate that the grant administrator changes the
wording in the application one year only to revert back and change it again the following year. These semantic issues in language can also lead to what the other administrator referred to as
“cumbersome” applications. Finally, another open-ended response indicated that the reviews are
“very inconsistent.” This inconsistency can lead to the LEA believing that approval is attributed
to the SEA grant reviewer rather than the LEA grant applicant. This, in turn, validates a gap in
attribution: the individual LEA believes that success on the application is attributed to the SEA grant reviewer rather than to the LEA’s own ability to submit an approvable application. Instead of their own ability, the grant administrator discusses the issue of “inconsistent reviews” by the SEA.
The open-ended surveys support the notion that the success of the application can very
much depend on the reviewer from the state agency. This is discussed in the SEA open-ended survey items as “reviewer reliability,” also described as “interrater reliability,” more than three times in different survey responses from SEA grant reviewers. The SEA reviewers discuss this reliability of review as an issue that could arise in both individual and team settings.
WSDE reviewer notes, “I believe all reviewers have some weakness in certain areas and others
simply have more experience/knowledge in certain areas than others.” These weaknesses could
potentially lead to the interrater reliability that the LEA describes as inconsistencies in reviews.
In a study of competitive grant application review for the National Institutes of Health, Pier et al. (2018) discovered inconsistency in reviewers’ evaluations of the same application. The researchers concluded that this inconsistency contributes to a decline in funding success rates, and their findings are relevant to other grant review processes.
Belief in Individual Capacity to Receive Final Approval on Applications
Self-efficacy, from the perspective of the LEA, involves the belief that the grant administrators are capable of submitting an application for federal funding to the SEA and receiving final approval upon the first submission. Self-efficacy is the belief that the grant administrator can complete the task of submitting a compliant application (Anderman & Anderman, 2006; Bandura, 2005). Self-efficacy is determined through judgments that an individual carries out to perform the task at hand (Pajares, 2006). Working knowledge of the application as well as the ability to submit it are functions that contribute to higher self-efficacy. In response to the survey question, "I am confident that my submission of the Consolidated Application for federal funds will receive 100% final approval upon the first submission," the grant administrators indicated a very low belief in their ability to receive 100% final approval upon the first submission: 10% strongly agreed, 31.67% agreed, 45% disagreed, and 13.33% strongly disagreed. These data indicate that the majority of respondents (58.33%) believe that they will not receive approval upon the first submission. A grant administrator commented
in the open-ended, optional response section, “based on my experience and commentary with
other authorized representatives, the Consolidated Application review comments and final
approval can differ from submission to submission, if reviewers at the State Level change each
time the application is reviewed.” While this could also be viewed as an attribution theory
influence, the grant administrator does not believe in their ability to submit an approved
application due to external factors that are beyond the applicant’s control.
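For transparency, the majority figure cited above follows from a simple aggregation of the reported percentages; the conversion to an approximate respondent count assumes that the full group of 60 LEA survey respondents answered this item:

\[ 45\% + 13.33\% = 58.33\%, \qquad 0.5833 \times 60 \approx 35 \text{ respondents} \]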
A grant reviewer at the SEA remarked, “I honestly believe it would be extremely rare for
an LEA to get fully approved on the first submission. The SEA/reviewers should always have
some sort of feedback for the LEA for improvement.” The SEA open-ended survey responses
also indicated that this lack of self-efficacy could be due to contextual factors including the size
of the district or clarity of expectations. SEA grant reviewers indicated that the LEA lacks a
clear understanding of the declarative knowledge components of the consolidated application
(use and purpose of federal funds) that contribute to the LEA’s ability to receive approval upon
the first submission. This is supported by an LEA open-ended response comment on the survey
noting, “We sign over most of our consolidated grants to our BOCES [LEA consortium] and
then work through them for grant planning. Without them, the man-hours to finalize this grant
funding would be impossible for a small rural district like ours.” This LEA representative
comments on the scale and scope of undertaking the entire grant as a smaller, rural district. Because the grant is so time-consuming, understanding the components of the grant is essential to expediting the process. This grant administrator indicates that the district needs to sign the funds over to a consortium in order for all of the components and details to be addressed within the application.
The lack of self-efficacy on the part of the LEA could be attributed to a number of factors that are further explored in the cultural models and settings of both stakeholder organizations. Regardless, this finding validates the gap in self-efficacy: the LEA does not fully believe that grant administrators are capable of meeting the organizational goal of submitting an application for federal funding to the SEA and receiving final approval upon the first submission. Building on the knowledge and motivation findings, the organizational results consider the dynamics of cultural models and settings that affect both the SEA and LEA in meeting the organizational goal.
Organizational Results
As noted in the conceptual framework of this study, the interplay of visible and invisible dynamic structures appears through the integration of education policy and effective practices using federal funds (Erez & Gati, 2004; Schein, 2017). This interplay is also evident in the dynamic between the WSDE and the LEA organizations it supports. The culture between the WSDE and local school districts is one forged through mutual interest and trust that intersects at the Elementary and Secondary Education Act (2018). Table 16 displays the assumed organizational influences and a summary of findings for each assumed influence. Influences were considered invalidated if less than 75% of the evidence supported the assumed influence; if more than 75% of the evidence validated the influence, the gap was considered validated. If insufficient data existed, the gap was undetermined.
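Stated compactly, the determination rule applied to each assumed influence can be expressed as follows, where $p$ denotes the proportion of collected evidence supporting the influence; the symbol $p$ is introduced here only for illustration, and the boundary case of exactly 75% is not specified in the rule itself:

\[
\text{determination} =
\begin{cases}
\text{gap validated} & \text{if } p > 0.75 \\
\text{gap invalidated} & \text{if } p < 0.75 \\
\text{undetermined} & \text{if the data are insufficient to estimate } p
\end{cases}
\]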
Table 16

Assumed Organizational Influences, Determination, and Summary of Findings

Assumed Organizational Influence: The LEA needs a culture of collaboration.
Determination: Gap validated. LEAs need a culture of collaboration with the SEA to improve application submissions and programs offered with federal funds.

Assumed Organizational Influence: The organization needs a culture of trust.
Determination: Gap validated. Both the SEA and LEA can improve trust relationships for enhanced outcomes towards organizational goals.

Assumed Organizational Influence: The organization needs to provide time to engage in training opportunities offered.
Determination: Gap validated. More time is needed for the LEA and SEA to engage in training opportunities.

Assumed Organizational Influence: The organization needs peer districts to model promising practices.
Determination: Gap validated. Further modeling with peer districts will enhance promising practice.
Culture of Collaboration
Based on survey response feedback from the LEA, the question related to valuable feedback and collaboration was answered favorably by local district grant administrators. When responding to question 14, "My district relies on the State Education Agency to provide valuable feedback for a consolidated application," the majority of respondents answered favorably, with either an agree (48.33%) or strongly agree (26.67%) response. These responses account for 75%, or three-quarters, of the respondents; however, there is still a subsection of respondents who answered disagree (23.33%) or strongly disagree (1.67%). Question 15 posed a similar statement: "Review comments are helpful from the state agency in order for my school district to receive 100% final approval on the consolidated application for federal funding." Referring to Table 16, the first gap addressed is the LEA's need to collaborate with the SEA to improve application submissions and review feedback time.
Table 17 displays the respondent information for both questions.
Table 17

Q14 Compared to Q15 Response Rates

Strongly Agree/Agree: 45 responses to Q14; 55 responses to Q15
Strongly Disagree/Disagree: 15 responses to Q14; 5 responses to Q15

Note. Based on an n count of 60 respondents.
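As a consistency check, the counts in Table 17 align with the percentages reported above for question 14, assuming that all 60 respondents answered the item:

\[ (0.4833 + 0.2667) \times 60 = 0.75 \times 60 = 45 \text{ favorable responses}, \qquad 60 - 45 = 15 \text{ unfavorable responses} \]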
When the SEA reviewers were asked open-ended questions regarding collaboration between the LEA and SEA, 18 of 21 SEA grant reviewers commented that the districts really need to attend the training offered by the SEA. One reviewer commented that the majority of the school districts "don't come to the trainings." On a similar note, another SEA grant reviewer mentioned, "We receive great feedback from people that attend. But attendance is low in comparison to the number of LEA applications submitted. Seems the same people attend the trainings." When asked about attendance concerning the 178 districts in the state, 12 of 21 grant reviewers remarked that around half of the districts that submit applications attend the training. In addition, feedback from 14 of the 21 state grant reviewers surveyed reveals that this issue could be attributed to capacity, which influences both the ability (time) to engage in training opportunities and the ability to foster collaboration between the SEA and LEA. One
SEA grant reviewer remarked:
A significant roadblock for an LEA to receive approval from reviewers on the first
submission of their application can be contributed to both LEA and [WSDE] capacity. A
portion of the smaller sized LEAs (student population) do not participate in the supports
offered [by WSDE] and are not aware of the various intricacies and details of the
consolidated application. Those LEAs often receive comments requesting sections or
components of their application to be changed.
The SEA grant reviewer provides insight into roadblocks to receiving final approval across all sizes of districts. Depending on the size and capacity of the district, many grant administrators at smaller-sized districts are not able to attend meetings. This is further triangulated later in the data analysis. The LEA grant administrators responded to this statement with further evidence that this influence is a gap for both the SEA and LEA. One LEA grant administrator responded that trainings are generally focused on larger school districts in the metro regions of the state. The LEA grant administrator said, "Rarely is it taken into consideration the needs and unique difficulties that small rural districts encounter or address."
The SEA employee mentions the various issues that lead to further misunderstanding and misalignment of the consolidated application, such as budget and coding issues or misalignment with priorities for student achievement. Engagement in training opportunities is further discussed in the cultural settings section. With both data sets from the LEA and SEA synthesized, it is determined that a gap exists for collaboration between both organizations. The next validated gap in Table 16 refers to establishing a culture of trust.
Establishing a Culture of Trust
As mentioned in the conceptual framework, trust is forged through interactions of the two
agencies through clear communication of expectations and engagement (Agocs, 1997; Korsgaard
& Whitener, 2002). One strong indicator of trust as elicited through data collection is the trust of
the grant administrator to submit an application that will not receive revision requests from the
SEA. Through the optional, short-response section off the LEA survey, one grant administrator
commented on their experience stating, “It is my experience that the application is always sent
back for revision- even if the entire application was re-submitted for the exact same needs and
activities, there is always some paragraph that needs to be tweaked.” The LEA representative
reflects on their experience with revision requests from the LEA. While the grant administrator
notes that the LEA may submit the exact same application as the year before, there is always
something that needs to be adjusted. One LEA respondent commented in the open-ended
response of the survey saying, “Sometimes it's hard to rely on CDE with constant staff turnover
and re-explaining things to a new person.” This statement provides further evidence suggesting
that trust relationships can be strengthened between the LEA and SEA.
There is a feeling of mistrust that is triangulated with other findings from SEA and LEA
responses. For example, another LEA grant administrator commented, “The Consolidated
Application is somewhat repetitive in the questions. The guidance from the SEA gets better each
year although I've found that feedback is dependent upon the reviewer." While this feedback appears positive in that the support and guidance are improving, the belief that the application is repetitive represents a further disconnect between the LEA providing feedback and the SEA being open to that feedback. Another piece of feedback from a grant administrator suggests much more support from the SEA. The grant administrator said, "Support from CDE is critical to
staying up to date on requirements for the Consolidated Application. We receive helpful and
timely support from the office of Federal Programs at CDE.” The LEA representative expresses
the need and desire for review comments, which indicates that the LEA representatives are open
to feedback.
The feedback provided by the SEA to the LEA can be very detailed and can require extensive time to address. Fifteen of 21 respondents from the SEA commented on the extensive time needed to provide feedback to the LEA. For the LEA grant administrator survey, the metric to record time spent on resubmissions is best captured through hours spent in revisions and resubmissions of the application. The resubmission period occurs after the first submission, once the LEA has received feedback on the activities applied for and whether they are reasonable, necessary, and allowable (EDGAR, 2018). On average, the respondents noted that they spend just over nine hours, with a mean of 9.33 hours and a standard deviation of 8.67. The average hours spent on resubmission of the consolidated application are displayed in Table 18.
Table 18

Average Hours Spent on Resubmission of Consolidated Application

Range (Min): 0
Range (Max): 30
Mean: 9.33
Standard Deviation: 8.67

Note. Questions without responses were eliminated from the data collection.
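For reference, the mean and standard deviation reported in Table 18 correspond to the standard descriptive formulas below, where $x_i$ is the number of resubmission hours reported by respondent $i$ and $n$ is the number of respondents who answered the item; whether the sample or population form of the standard deviation was computed is not stated, so the sample form shown here is an assumption:

\[ \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i = 9.33, \qquad s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n} (x_i - \bar{x})^2} = 8.67 \]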
The LEA representatives were asked to respond to a question item related to the number
of hours spent revising. The responses are listed in Figure 6.
Figure 6. Hours Spent on the Consolidated Application. Respondents answered the question, "How many hours are spent on resubmissions of the consolidated application?" using response categories ranging from 1 to 5 hours to over 30 hours.
The data suggest that 49% of the district representatives spend between one and five hours on resubmission. The remaining respondents spend six or more hours, with 23% spending six to ten hours. As noted in the conceptual framework, building federally funded
programs requires collaborative partnerships forged by trust between state agencies and school
districts (Bensimon & Neuman, 1993; Hill et al., 2009). This amount of time commitment
requires confidence from the LEA grant administrators to trust that the comments are consistent,
accurate, and reflect the needs of the LEA to meet compliant practices and consider effective
practices for K-12 students using federal funds. This is outlined in the purpose of ESSA, as is the ability of federal programs to connect policy, research, and new supplemental practices for high populations of low-performing, targeted groups of students (Kainz, 2019).
According to Zak (2017), a culture of trust can lead to powerful leverage on
organizational performance. This is an involved commitment for both the SEA and LEA. The
time to engage in training and iterative revision opportunities is one demonstration of the trust
and commitment between both parties involved. Given that approximately nine hours is the average amount of time spent on revisions of the consolidated application, it is suggested that additional time could be invested in prioritizing initiatives between the LEA and SEA. If the
SEA and LEA engaged in more time to troubleshoot issues through building trusting
relationships, the LEA could potentially spend less time revising the application.
In the open-ended responses, the SEA grant reviewers expressed positive sentiment about the support services offered by the state. SEA grant reviewers noted that the state has worked to develop a working relationship and "rapport" with district grant administrators and leaders. Ten of the 21 grant reviewer respondents indicated that technical assistance and support are always available from the SEA, but only upon request from the LEA. Additionally, the support was described as "high-level content that does not always address specific needs or practices."
Time to Engage in Training Opportunities Offered
The conceptual framework of this study establishes the importance of training through
the review of related literature. According to Clark and Estes (2008), training is a necessary
component to integrate solutions within an organization. Through the qualitative survey using open-ended responses with the SEA representatives, 18 of the 21 respondents indicated a desire to provide further support and guidance to the LEA to meet the goal of 100% approval of
consolidated applications for federal funds. Many of the reviewers also indicated that the
supports are offered upon request. This is further indicated through descriptions of
individualized support offers. The SEA discusses their own inability to provide individualized
supports to every LEA due to a lack of capacity of the reviewer at the SEA. One SEA grant
reviewer responded:
I've seen LEAs receive final approval on the first submission. I think the difference is
also skilled involvement by [Title] program personnel in understanding the weaknesses
of the system and guidance materials and being able to coach LEAs through those
discrepancies. We could also do a much better job in providing edit checks for coding--
this is one of the most frequent needs for revisions.
The response from the SEA grant reviewer indicates that edit checks in the application, such as budget coding activities and other discrepancies, are issues that lead to the LEA not receiving final approval on the first submission of the consolidated application. The SEA representative
identifies gaps in training from the understanding of the SEA staff and also expresses a need and
desire to “coach LEAs through discrepancies.” This statement further supports the notion that
the SEA would like to help the LEA but could spend additional time providing support.
The LEA grant administrators responded to two questions related to training
opportunities provided by the SEA. The training opportunities are referred to as “Regional
Network Meetings” and are the primary formal training opportunities offered by the SEA. The
survey question, “I believe Network Meetings help you to submit an application for 100% final
approval from the state agency” helps further elucidate how training efforts delivered by the SEA
are viewed by the LEA. Table 19 further illustrates the responses to this question.
Table 19

I Believe Network Meetings Help to Submit an Application

Strongly agree: 26.67%
Agree: 46.67%
Disagree: 21.67%
Strongly disagree: 5%
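The aggregate percentages discussed below follow directly from Table 19. Assuming that all 60 LEA respondents answered this item, the reported percentages correspond to the whole-number counts shown here, which is how 26.67% and 46.67% combine to the 73.33% cited in the next paragraph:

\[ \frac{16 + 28}{60} = \frac{44}{60} \approx 73.33\%, \qquad \frac{13 + 3}{60} = \frac{16}{60} \approx 26.67\% \]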
The response rate indicates that 73.33% of respondents believe that the Regional
Network Meetings are helpful for submission of the application; however, there remains over
one-quarter of the respondents, 26.67%, who disagree or strongly disagree with this statement.
This further illustrates that LEA representatives find the training necessary and that more time could be built in for training and support from the SEA. This question also provides evidence that more work can be done by the SEA to make the training more meaningful for the LEA grant administrators.
The next survey item provided to the LEA is in reference to the question, “I plan to attend
the next Regional Network Meeting.” The LEA representatives responded to the statement using
the same four-point scale answer choices as the rest of the survey. The LEA responses included
65% indicating that they strongly agree or agree and the remaining 35% indicating that they
disagree or strongly disagree. With nearly two-thirds of the respondents indicating that they plan to return, it is evident that further time to engage in training is an identified need. Another SEA
grant reviewer commented on the type of support and training provided indicating:
The office also offers 1:1 [individual] supports to LEA’s but the support is only provided
to LEAs that request the support. CDE’s capacity to support all the districts assigned to
Regional Contacts also creates a roadblock for successful submission of an LEA’s
application. In prioritizing multiple areas of the work such as monitoring, application
development and support, organizing/facilitating internal and external supports regarding
the consolidated application, and many other things, I do not have the capacity to take the
initiative and reach out to all of the districts in my region and provide support and
coaching in successfully completing their application. I would like to review applications
submitted in the past and support LEA’s in submitting better quality application that are
implementing evidence-based practices but I do not have the capacity to do that in
addition to managing the workload of other projects and completing reviews in the time
frame allotted (July 31).
The SEA grant reviewer discusses a need to provide further supports, but a lack of time to do so
with fidelity. The issue presented by the SEA employee indicates that there are too many other
work-related responsibilities in the workload of the SEA employee. This particular insight suggests that additional time should be set aside for SEA members to provide training rather than the other responsibilities indicated in the response.
Peer Districts to Model Promising Practice for Implementation
Ozcan (2008) discusses the importance of peer benchmarking through its ability to
provide insights into how other organizations are functioning and operating. The data elicited
from surveys by both the SEA and LEA reveals that there is a need from both organizations to
provide additional opportunities for the LEAs to engage in meaningful dialogue to learn more
about the similarities and interrelationships that exist for federally funded opportunities. The
SEA provided further insight into this through qualitative open-ended response examples
outlining the need for further engagement opportunities with districts that share similar traits and
contexts. Two responses from SEA grant reviewers indicated that the training provided to LEAs could include further "opportunities for districts to learn from each other," and another grant reviewer noted that "sharing effective practices" could benefit the field at large and all 178 LEAs in the state. Throughout the qualitative open-ended responses, there are 11 total mentions
of local context from the 21 respondents. Each SEA open-ended survey response reflects a
desire to provide further training using local context to drive the planning, implementation, and
monitoring of activities using federal funding sources. Another SEA employee responded to a
qualitative survey question noting:
During my tenure, there have been varying degrees of training. Currently, we are not
mindful of creating adult learning opportunities, nor collaborating with practitioners in
sharing their expertise. A great training is one in which we (SEA) talk less, and the LEAs
share more.
This response indicates a need for collaboration with practitioners in the field and sharing
expertise. This also reflects a need for peer districts to share with each other in an effort to establish more dialogue from the stakeholder, the LEA, rather than the SEA merely disseminating information. The LEA respondents provided feedback related to the training
provided by the SEA to the LEA. One respondent reflected in the optional, open-ended response
of the survey:
The struggle that we have with our Consolidated application is that the regional meeting
and the support/feedback given from CDE is always geared toward large school districts.
Rarely is it taken into consideration the needs and unique difficulties that small rural
districts encounter or address.
This individual reflects that the context of the training and support is not aligned to the individual needs of smaller, rural districts. The respondent points out that different districts have very different needs to address. This provides further data to support the need for peer district support. Additionally, another LEA responded in the open-ended
comment section, “It would be interesting to know how [this western state’s] system compares
with what the process is in other states for districts to submit applications and receive approval
for Title programs." While providing peer district support is an identified need established for this cultural setting, the district representative also requests peer alignment at the state level. This is further validated through 13 of 15 open-ended LEA responses confirming that the LEA's context drives the submission. For example, whether a district is a small rural district or a larger metro district will influence decision-making and submission of the grant. One LEA response indicated that the training at a "regional meeting and the support/feedback given from [WSDE] is always geared toward large school districts."
Synthesis
This research study assumed ten influences that may affect both the state and local education agencies' ability to achieve their goal of meeting 100% final approval of the consolidated application for federal funds within one month of submission. The study may also inform the secondary goal embedded in the intent and purpose of the Elementary and Secondary Education Act, as reauthorized by the Every Student Succeeds Act, to enhance academic and linguistic opportunities for historically underserved students. Embedded in the submission of a 100% compliant consolidated application for federal funds is the intent of the funds to build effective, research-based and evidence-based interventions to improve outcomes for historically underserved students. This study identified areas of improvement for three interrelated knowledge influences: declarative, procedural, and metacognitive knowledge. Additionally, this study identified two motivation influences, self-efficacy and attribution theory, as challenges to the organizations' ability to achieve the goal. Finally, four organizational influences, two cultural models and two cultural settings, were identified as areas for improvement. Table 20 shows a complete table of the validated influences for this study.
Table 20

Assumed Knowledge, Motivational, and Organizational Influences, Determination, and Summary of Findings

Knowledge

Assumed Influence: LEA needs declarative knowledge of the elements of the Consolidated Application for federal funding in the western state K-12 public education system.
Determination: Gap validated. LEAs need further guidance and supports for the elements of the Consolidated Application for federal funds.

Assumed Influence: LEAs need to know how to incorporate a comprehensive needs assessment into activity plans for Title funding.
Determination: Gap validated. LEAs need to incorporate the needs assessment into activity plans for Title funding.

Assumed Influence: LEAs need to know how to self-reflect on the effectiveness of their federal programs.
Determination: Gap validated. LEAs need further guidance on strategies to self-reflect on the effectiveness of federal programs at the local level.

Motivation

Assumed Influence: LEAs should feel that approval of grant applications is based on their ability to submit a compliant grant application rather than the reviewer's comments.
Determination: Gap validated. LEAs feel that approval of the grant is based on the grant reviewer rather than the ability of the grant administrator to submit a compliant application.

Assumed Influence: Grant administrators need to believe that receiving the grants will improve academic performance/help close achievement gaps for traditionally underserved populations.
Determination: Gap invalidated. Grant administrators believe the grants will improve academic performance and help close achievement gaps for traditionally underserved populations.

Assumed Influence: LEAs need to believe they are capable of submitting an application for federal funding and receiving approval upon the first submission.
Determination: Gap validated. LEAs do not believe they are capable of submitting an application for federal funding and receiving substantial approval upon the first submission.

Organization

Assumed Influence: The LEA needs a culture of collaboration.
Determination: Gap validated. LEAs need a culture of collaboration with the SEA to improve application submissions and programs offered with federal funds.

Assumed Influence: The organization needs a culture of trust.
Determination: Gap validated. Both the SEA and LEA can improve trust relationships for enhanced outcomes towards organizational goals.

Assumed Influence: The organization needs to provide time to engage in training opportunities offered.
Determination: Gap validated. More time is needed for the LEA and SEA to engage in training opportunities.

Assumed Influence: The organization needs peer districts to model promising practices.
Determination: Gap validated. Further modeling with peer districts will enhance promising practice.
Many of the findings are interrelated across influences, especially those related to areas for improvement. For example, procedural knowledge of the consolidated application is built upon
declarative knowledge of the components of the application. Additionally, the findings indicate
that grant administrators do not feel efficacious in submitting an application for final approval.
This identified lack of self-efficacy could be attributed to the LEA grant administrator’s
procedural and declarative knowledge related to the consolidated application as well as the
attribution that the grant administrator’s ability to submit an application for final approval is
based upon their own understanding of the components (declarative knowledge) of the
application rather than the SEA employee’s ability to review the application. The trust and
collaboration built and fostered between the state agency and local education agency further
contribute to issues in training and development of effective programs that are research and
evidence-based interventions for historically underserved students.
Despite some general comments regarding processes at the state level, LEA grant
administrators expressed interest in attending training and provided insight into the value of the
training at the local level. The findings conclude that both the SEA and LEA would like to
participate in further individual troubleshooting training opportunities as well as building in more
context-driven supports through peer benchmarking processes. Without building upon
declarative and procedural skills needed to submit a compliant application for federal funds,
activities in the application cannot be built and further enhanced for more effective programs in
future applications.
CHAPTER FIVE: SOLUTIONS AND RECOMMENDATIONS
Chapter 4 presented the results and findings from the data collected in both qualitative
open-ended surveys and quantitative surveys addressing the study’s research questions
identifying knowledge, motivation, and organizational influences affecting the organizational
goal. The organizational goal for both the state education agency (SEA) and local education
agency (LEA) involves 100% approval of the consolidated application within the thirty-day
deadline. By meeting this goal, both agencies can strive towards the organizational mission to
reach beyond compliance of the application to effective practices using federal funds. Influences were considered invalidated if less than 75% of the evidence supported the assumed influence; if more than 75% of the evidence validated the influence, the gap was considered validated, and if insufficient data existed, the gap was undetermined. Nine of the influences identified in this study through the
literature review were found to have gaps contributing to the organization’s ability to meet the
goal of 100% approval of the consolidated application within a thirty-day deadline. One
influence was invalidated due to responses reflecting that grant administrators already do believe
that the grant opportunities will improve academic performance and close achievement gaps for
historically underserved students.
This chapter identifies recommendations based on current knowledge, motivation, and
organizational resources available to improve both the SEA and LEA’s ability to develop,
implement, and submit consolidated applications that meet 100% approval within the thirty-day
deadline. The recommendations discussed in this chapter are based on validated influences
assessed during data analysis. The recommendations are organized in the following sections by
knowledge, motivation, and organizational influence and follow the gap analysis approach
developed by Clark and Estes (2008). Additionally, Kirkpatrick and Kirkpatrick’s (2016) New
World Kirkpatrick Model framework is employed for integrated implementation and evaluation
recommendations. The evaluation and implementation plans are employed to reduce or
eliminate identified knowledge, motivation, and organizational gaps. Finally, this chapter
outlines the limitations of the study and recommendations for future research.
Recommendations for Practice to Address KMO Influences
Three knowledge gaps, one declarative, one procedural, and one metacognitive, were
validated during data collection. Additionally, two motivation gaps related to attribution theory
and self-efficacy theory were validated as well as four organizational influences including two
cultural models and two cultural settings. Table 21 shows a complete summary table of the
knowledge, motivation, and organization gaps validated in the study. The findings indicate that
further collaboration between the SEA and LEA can enhance federally funded initiatives to not
only increase approved applications upon the first submission but elevate practice from
compliance to evidence-based interventions for students.
Table 21

Assumed Knowledge, Motivational, and Organizational Influences, Determination, and Summary of Findings

Knowledge

Assumed Influence: LEA needs declarative knowledge of the elements of the Consolidated Application for federal funding in the western state K-12 public education system.
Determination: Gap validated. LEAs need further guidance and supports for the elements of the Consolidated Application for federal funds.

Assumed Influence: LEAs need to know how to incorporate a comprehensive needs assessment into activity plans for Title funding.
Determination: Gap validated. LEAs need to incorporate the needs assessment into activity plans for Title funding.

Assumed Influence: LEAs need to know how to self-reflect on the effectiveness of their federal programs.
Determination: Gap validated. LEAs need further guidance on strategies to self-reflect on the effectiveness of federal programs at the local level.

Motivation

Assumed Influence: LEAs should feel that approval of grant applications is based on their ability to submit a compliant grant application rather than the reviewer's comments.
Determination: Gap validated. LEAs feel that approval of the grant is based on the grant reviewer rather than the ability of the grant administrator to submit a compliant application.

Assumed Influence: Grant administrators need to believe that receiving the grants will improve academic performance/help close achievement gaps for traditionally underserved populations.
Determination: Gap invalidated. Grant administrators believe the grants will improve academic performance and help close achievement gaps for traditionally underserved populations.

Assumed Influence: LEAs need to believe they are capable of submitting an application for federal funding and receiving approval upon the first submission.
Determination: Gap validated. LEAs do not believe they are capable of submitting an application for federal funding and receiving substantial approval upon the first submission.

Organization

Assumed Influence: The LEA needs a culture of collaboration.
Determination: Gap validated. LEAs need a culture of collaboration with the SEA to improve application submissions and programs offered with federal funds.

Assumed Influence: The organization needs a culture of trust.
Determination: Gap validated. Both the SEA and LEA can improve trust relationships for enhanced outcomes towards organizational goals.

Assumed Influence: The organization needs to provide time to engage in training opportunities offered.
Determination: Gap validated. More time is needed for the LEA and SEA to engage in training opportunities.

Assumed Influence: The organization needs peer districts to model promising practices.
Determination: Gap validated. Further modeling with peer districts will enhance promising practice.
Knowledge Recommendations
The three types of knowledge influences investigated include declarative, procedural, and metacognitive types (Krathwohl, 2002; Mayer, 2011). The following sections address the three knowledge influences on the LEA's submission process for federal funding applications, with each influence categorized referencing the discussion above. As indicated in
Table 21, these assumed influences are validated gaps for the stakeholder achieving the goal of
100% final approval of federal applications for public school funding from the State Education
Agency (SEA). Table 22 shows the recommendations for these validated knowledge influences based on learning theory and theoretical principles from the literature.
The needs of the district are developed through what Kirschner et al. (2006) describe as a schema, or the framework for categorizing information and applying its elements to a given task. The authors found that upon building schema, connections are made meaningfully by building on prior knowledge and grounding background knowledge in what is previously understood. Schneider et al. (2011) conducted a study analyzing data from two groups of students to determine factors contributing to the connection between conceptual and procedural knowledge. The three features of the study included whether the predictive relations between conceptual and procedural knowledge are bidirectional, whether the interrelations are moderated by prior knowledge, and how both constructs contribute to procedural knowledge. The study concluded
that schema is further developed through the intentional development of prior knowledge and
conceptual knowledge to contribute to procedural flexibility. This study relates to broader issues
of education as it considers the procedural and conceptual knowledge in order to contribute to
greater competence development in a procedural task. The results of the study indicated that conceptual and prior knowledge assisted performance. It was found that prior declarative knowledge promotes optimal learning environments for procedural tasks.
The open-ended qualitative surveys with the SEA representatives and surveys from LEA
grant administrators revealed knowledge gaps represented in Table 22, which contains a
complete list of knowledge influences and recommendations based on the literature reviewed.
Table 22

Summary of Knowledge Influences and Recommendations

Knowledge Influence: LEA needs declarative knowledge of the elements and sections of the Consolidated Application for federal funding in the western state K-12 public education system.
Knowledge Type: Declarative
Principle and Citation: Information learned meaningfully and connected with prior knowledge is stored more quickly and remembered more accurately because it is elaborated with prior learning (Schraw & McCrudden, 2006).
Context-Specific Recommendation: Break down complex tasks involved in the formula federal funding application for public education using job aids containing key vocabulary related to allocability and allowability along with statutory references (completing the federal application; compliance indicators: reasonable, necessary, allowable).

Knowledge Influence: LEAs need to know how to incorporate a comprehensive needs assessment into activity plans for Title funding.
Knowledge Type: Procedural
Principle and Citation: Targeting training and instruction between the individual's independent performance level of compliance with federal funding and their level of assisted performance promotes optimal learning and higher approval rates of the application (Scott & Palincsar, 2006). Providing scaffolding and assisted performance in a person's ZPD promotes developmentally appropriate instruction (Scott & Palincsar, 2006).
Context-Specific Recommendation: Provide timely feedback that links the use of learning strategies related to compliant submissions of applications (root cause analysis and needs assessment) with improved performance (compliant application submissions). Provide training that utilizes case studies as a scaffold to model procedures related to the application of federal funds for Local Education Agencies. Provide LEA workgroup sessions to engage in context-specific problems of practice (PoP).

Knowledge Influence: LEAs need to know how to self-reflect on the effectiveness of their federal funding programs.
Knowledge Type: Metacognitive
Principle and Citation: The use of metacognitive strategies facilitates learning (Baker, 2006).
Context-Specific Recommendation: Have LEAs identify prior knowledge (what they know and what they do not know about the components of the consolidated application) before submitting to the state education agency. Provide opportunities for LEAs to engage in guided self-monitoring and self-assessment using a checklist of compliance indicators (built in collaboration of the SEA and LEA). Provide peer LEAs from similar contexts to model their own metacognitive processes by talking out loud and assessing the strengths and weaknesses of the consolidated application submission.
The LEA needs to develop an understanding of the components of the consolidated
application for federal funding. The results of the survey data from the LEA and SEA revealed
that the Local Education Agency needs declarative knowledge of the elements and sections of
the Consolidated Application for federal funding in the western state K-12 public education
system. A recommendation grounded in information processing theory has been selected to
close this declarative knowledge gap. Information processing is focused on learning occurrences
and practices that occur throughout the learning process. Processes include perception, attention,
storage, and recall as well as the role of recall and autonomous response (Rueda, 2011). Schraw
and McCrudden (2006) found that for information to be learned meaningfully and connected
with prior knowledge, it must be elaborated upon with prior knowledge. The basic
understanding of the federal application falls under the category of declarative knowledge as it
follows the facts and information essential for understanding the rest of the process of
completing and submitting the federal funds application for K-12 public schools. This
information also informs what training is conducted and how. This further supports the notion that the procedural knowledge associated with completing the consolidated application increases when the declarative knowledge required to perform the skill is available and known. The recommendation, then, is to break down complex tasks involved in the formula federal funding application for public education using job aids containing key vocabulary related to allocability and allowability along with statutory references.
LEAs need to incorporate a comprehensive needs assessment into planning for
federal funds. Through data collection and analysis, it was found that the procedural knowledge influence is a validated gap. Procedural knowledge influences the processes and procedures involved in the planning, development, and implementation of a state or local plan for federally funded education initiatives. A recommendation rooted in sociocultural theory has been selected to close this procedural knowledge gap. Drawing from Scott and Palincsar (2006), targeting training and instruction between the individual's independent performance level and their level of assisted performance promotes optimal learning. In this context, the targeted training relates
to compliance with federal funding and the applicant’s knowledge of compliance indicators for
approval rates of federally funded grant applications. This would suggest that providing learners
with timely feedback that links learning strategies leads to compliant submissions of
applications. The recommendation is to provide feedback that links the use of learning strategies
related to compliant submissions of application with improved performance including root cause
analysis and comprehensive needs assessment. The compliant submissions can be viewed as
“optimal learning” or the level of assisted performance as recommended by Scott and Palincsar
(2006). An example of this feedback would link the trainer with the applicant to provide
individualized feedback to the LEA analyzing the process used to approach the needs assessment
and its links to the components of the application.
The goal of training from the SEA to LEA is to help develop the procedural knowledge
necessary to incorporate activities identified through the needs assessment. This recommendation
is grounded in sociocultural theory, which implies the theory of assisted performance. In precedent-setting research, Tharp and Gallimore (1989) explained assisted performance as occurring when the learner is able to perform a task with assistance from a more knowledgeable other. In this context, the concept of the zone of proximal development is adopted from Vygotsky (1978); it provides a space where a more knowledgeable peer or expert offers assisted performance to facilitate further internalization and engagement in a recursive loop of learning and demonstrating a task. Therefore, the mere instruction of a procedural task is not sufficient to develop the skill necessary to perform the task; feedback from the more knowledgeable other, either a peer LEA or an SEA representative, is necessary to facilitate learning to understand the components of the consolidated application.
The LEA needs to reflect on the effectiveness of federally funded programs. The
third knowledge influence relates to the school district administrator’s need to know how to self-
reflect on the effectiveness of their federal programming. A recommendation embedded in
information processing theory has been selected to close this metacognitive knowledge gap.
Baker et al. (2015) suggest that the use of metacognitive strategies facilitates learning. The authors suggest that providing learners with peer support to model processes would support learning and metacognitive processing. The recommendations for the metacognitive knowledge
influence include having LEAs identify prior knowledge of the components of the consolidated
application, providing opportunities for LEAs to engage in guided self-monitoring and
assessment using a checklist of indicators (collaboratively constructed by the SEA and LEA),
and providing peer LEAs from similar contexts to model their own metacognitive processes by
talking out loud and assessing strengths and weaknesses of consolidated application examples.
An example of this would be to provide online learning environments with breakout groups
dedicated to learning cohorts with similar peer school districts to discuss funding initiatives and
their impact on student learning.
De Backer et al. (2012) explored the potential of reciprocal peer tutoring to promote university students' metacognitive knowledge and use of metacognitive regulation
skills. The study was conducted in a naturalistic setting involving 67 students tutoring each other
during a complete semester. The results revealed significant and frequent use of metacognitive
regulation skills among the adult learners in a university setting, especially during the
orientation, monitoring, and evaluative phases. According to the researchers, modeling
processes, as well as pre, formative, and post assessment, helps to enhance understanding of a
task as well as the transmission of knowledge and concepts. The findings support Baker et al.'s (2015) assertion that metacognitive strategies facilitate learning. The researchers also posit that metacognitive tasks are affected by the participants' understanding of their own cognitive processes.
Motivation Recommendations
Two types of motivation influences were evaluated in this study: attribution theory and self-efficacy. Table 23 presents the complete list of motivation influences. The validation was based on the most frequently mentioned motivation influences in achieving the stakeholder's goal of 100% compliant applications upon the first submission for federally funded applications.
Clark and Estes (2008) discuss motivation as the second component in examining performance issues and conducting a gap analysis. For this investigation, motivation is examined within the Local Education Agency's ability to submit an approved application for federal funding for K-12 public education (Mayer, 2011). The two theories or constructs relevant to the performance goal of an LEA reaching final approval on federal funding applications are attribution theory and self-efficacy theory. Attribution theory is related to
internal and external factors that affect task completion whereas self-efficacy is the individual’s
belief that he or she can complete the task (Bandura, 2005; Weiner, 2010). Table 23 outlines
context-specific recommendations related to attribution theory, or attributing success to one's own effort, and to self-efficacy, the belief that the applicant is capable of submitting an application for federal funding and receiving approval upon the first submission.
Table 23

Summary of Motivation Influences and Recommendations

Motivation Influence: LEAs should feel that approval of grant applications is based on their ability to submit a compliant grant application rather than the reviewer's comments.
Motivation Type: Attribution
Principle and Citation: Provide feedback that stresses the process of learning, including the importance of effort, strategies, and potential self-control of learning (Weiner, 2010). Provide feedback that stresses the nature of learning, including the importance of effort, strategies, and potential self-control of learning (Pintrich, 2003).
Context-Specific Recommendation: Provide accurate feedback that identifies the skills or knowledge the individual needs for the components of the consolidated application, along with communication that skills and knowledge can be learned through modeling and training. This is followed by the teaching of these skills and knowledge necessary to submit a compliant application for federal funding.

Motivation Influence: LEAs need to believe they are capable of submitting an application for federal funding and receiving approval upon the first submission.
Motivation Type: Self-efficacy
Principle and Citation: Self-efficacy is increased as individuals see themselves succeed in a task (Bandura, 1997). High self-efficacy can positively influence motivation (Pajares, 1997). Feedback and modeling increase self-efficacy (Pajares, 1997).
Context-Specific Recommendation: Feedback and modeling are provided from the SEA to the LEA in order to break down the task into manageable, benchmarked components and allow for the grant administrator to see success on the first submission of the consolidated application.

LEAs should attribute their success and failure to their own efforts. The results from quantitative LEA surveys and open-ended, qualitative SEA surveys revealed that there is a gap in the LEA attributing their application success to their own ability. Therefore, a gap was identified: LEAs should feel that approval of grant applications is based on their own ability to submit a compliant grant application rather than on the reviewer's comments. A recommendation drawn from attribution theory has been selected to close this motivation gap. Weiner (2010) found that providing feedback that stresses the process of learning, including the importance of effort, strategies, and self-control of learning, helps to facilitate attribution of success to the learner. This would suggest that providing learners with accurate feedback using a
checklist or rubric of the review process. The recommendation then is for the SEA to provide
feedback that stresses the nature of learning, including the importance of effort, strategies, and
potential self-control of learning (Pintrich, 2003). This recommendation further emphasizes attribution retraining by providing accurate feedback that identifies the skills or knowledge the individual lacks in the components of the consolidated application for federal funding, along with communication that these skills and knowledge can be learned through further modeling and training. After feedback and modeling have occurred, the SEA can
provide further training of the skills and knowledge necessary to submit a compliant application
for federal funding.
According to Weiner (2010), attribution occurs when a learner attributes success in a task to internal factors, such as ability or effort, or to external factors, such as task difficulty or luck. The author explains that individuals may attribute failure at a task to a lack of ability, a lack of effort, or poor instruction. Korn et al. (2016) discussed performance feedback processing as predicted by attribution theory. The study points to the relevance of attribution theory for feedback processing in decision-making. Drawing from the study findings, participants' behavior in a task was most closely linked to measures of attribution style, and positive and negative performance feedback influenced the evaluation of task-related stimuli. This study further reinforces the notion that feedback greatly influences the
learning and attribution process (Weiner, 2010).
LEAs need to believe in their capacity to receive final approval from the SEA.
Through data collection and analysis, a second motivation gap was identified that LEAs need to
believe they are capable of submitting an application for federal funding and receiving
substantial approval upon the first submission. A recommendation rooted in self-efficacy theory has been selected to close this motivation gap. Bandura (1997) found that self-efficacy is increased as individuals succeed in a task. Additionally, Yusef (2011) noted that high self-efficacy can positively influence motivation and that feedback and modeling increase self-efficacy. The
recommendation, therefore, is for the SEA to provide feedback and modeling to the LEA to
break down the task into manageable, benchmarked components and allow for the grant
administrator to see success on the first submission of the consolidated application. This
recommendation suggests that providing grant administrators with collaborative processes, including building rubrics and checklists to assist in the submission and revision of consolidated applications for federal programs, will help the LEA
believe that they are capable of submitting an application for federal funding and receiving
substantial approval upon the first submission.
In a study conducted by Shen et al. (2013), self-efficacy was researched as a key
component in successful online learning. The researchers focused on modeling and feedback as
foundational elements to developing self-efficacy. These findings contribute to the notion that
self-efficacy is a factor towards successful task completion and modeling and feedback are
avenues to help facilitate and develop self-efficacy in a learner. Of the two sample groups, the group provided with feedback and models of learning demonstrated greater self-efficacy and mastery of skill. According to the literature, providing feedback and modeling from the SEA to the LEA in manageable, benchmarked components will allow the grant administrator to develop self-efficacy and lead to success on the submission of the consolidated application.
Organization Recommendations
According to the literature, researchers have described culture in an organizational setting
as the culmination of beliefs, initiatives, viewpoints, sentiments, and learned procedures. The
accumulated shared learning of a group culminates in the structures, principles, and values that
are contextually related to the environment in which the learning occurs (Berger, 2014; Kezar,
2001; Schein, 2017). When knowledge and motivation influences have been ruled out,
organizational influences can be the source of barriers within an organizational structure (Clark
& Estes, 2008; Schein, 2017). Rueda (2011) asserts that organizational factors contribute to the
gap analysis framework by diagnosing the goals of an organization and evaluating outcomes
related to the established goals.
For this study, organizational factors will be investigated in terms of Gallimore and
Goldenberg’s (2001) framework of cultural models and settings. According to the authors,
cultural models refer to the shared understanding of conceptual schemas and practices that
contribute to greater global goals within an organization. Cultural models are generally invisible
and contribute to trust in an organization. The second is the cultural setting which is a visible
and concrete manifestation occurring within the social environment in which work is conducted.
The cultural setting is often tied to performance goals and feedback (Gallimore & Goldberg,
2001; Schein, 2017). For this study, cultural models are investigated through the lens of cultures
of collaboration and trust, whereas the cultural settings are situated in training opportunities.
Table 24 outlines a summary of the organizational influences and recommendations.
Table 24
Summary of Organization Influences and Recommendations

Organization influence: The organization needs a culture of collaboration (Cultural Model). Principle and citation: Organizational performance increases when individuals communicate constantly and candidly to others about plans and processes (Clark & Estes, 2008). Context-specific recommendation: Look for ways to work the message into already-existing forms of communication, but also look for new ways (including live-time chat, listserv, and office hours) to collaborate, as people may have gotten used to (and now ignore) the regular communication vehicles.

Organization influence: The organization needs a culture of trust (Cultural Model). Principle and citation: Organizations with high levels of cultural trust tend to produce high-quality products and services because they can recruit and retain highly motivated employees (Colquitt et al., 2007). Context-specific recommendation: Set collaboratively designed goals between the SEA and LEA and allow the organizations to determine how to reach them; demonstrate confidence in each organization's ability to succeed.

Organization influence: The organization needs to provide time to engage in training opportunities offered (Cultural Setting). Principle and citation: Effective change efforts ensure that everyone has the resources (equipment, personnel, time, etc.) needed to do their job and that, if there are resource shortages, resources are aligned with organizational priorities (Clark & Estes, 2008). Context-specific recommendation: Provide frequent technical assistance and training opportunities with SEA reviewers and LEA grant administrators to establish goals and time frames for submission of the consolidated application.

Organization influence: The organization needs peer districts to model promising practices (Cultural Setting). Principle and citation: Organizational culture is created through shared experience, shared learning, and stability of membership; it is something that has been learned and cannot be imposed (Schein, 2017). Context-specific recommendation: Make connections with partners (in peer districts) who can extend the organization's reach, enhance its offerings, or energize its practices.
The organization needs a culture of collaboration. Through data collection and
analysis, a gap was validated that LEA and SEA organizations need a culture of collaboration.
This organizational influence is part of Gallimore and Goldenberg’s (2001) cultural model
framework where shared understandings contribute to the greater goals and mission of the
organization. A recommendation situated in organizational accountability theory has been
selected to close this organizational gap. Clark and Estes (2008) found that organizational
performance increases when individuals communicate constantly and candidly to others about
plans and processes. The recommendation, then, is to work messaging into already-existing forms of communication while also adding new forms, such as live-time chat, listserv management software, and open office hours, to create connection points beyond the existing communication models.
Bordean (2009) discusses the importance of organizational communication as not just an
invention in management procedures, but a basic need. According to the researcher,
communication improves the working environment, creates alignment with the organization's objectives, and can be a source of increased productivity. The researcher found
that information is helpful, but involving employees and stakeholders in the process of
communication generates support, passion, a sense of community and leads to a more honest
organizational culture. Therefore, finding new modes of communication that involve employee
and stakeholder collaboration will lead to higher organizational performance (Bordean, 2009;
Clark & Estes, 2008).
The organization needs a culture of trust. The data revealed that a culture of trust is needed at both the state and local levels of the education agencies. A
recommendation rooted in organizational accountability has been selected to close this
organizational gap. Colquitt et al. (2007) note that organizations with high levels of cultural trust
tend to produce high-quality products and services. This would suggest that employees who demonstrate trust are more likely to take their work seriously, take risks, and do their work correctly. The authors also suggest that high levels of cultural trust lead to greater alignment with
organizational goals, mission, and vision. Therefore, the recommendation is to set
collaboratively designed goals between the SEA and LEA. Collaboratively designed goals can
be co-created and designed in online platforms such as shared documents as well as formal
meetings including the Regional Network Meeting offered by the Federal Programs Unit of the
state education agency. Additionally, the SEA and LEA would determine how to reach the
collaboratively created goals while demonstrating confidence in each organization’s ability to
succeed. This would require further communication and the deliberate, intentional design of a mission and vision for the desired outcomes of both entities.
Organizations that operate under a culture of trust are more likely to succeed in shared
goals. Employees engaged in a culture of trust enjoy their work, perform tasks correctly, make decisions, take risks, innovate, and embrace the organization's mission and values (Colquitt,
Scott & LePine, 2007 as cited in Starnes, Truhon & McCarthy, 2010). Further research suggests
that communication is instrumental in trusting relationships. Nordin (2013) explored perceptions
of organizational leaders regarding organizational change in a case study of 25 participants.
Participants answered questions related to meeting organizational goals at a higher education institution.
The qualitative open-ended survey responses revealed that pertinent information can be shared
in-person as well as electronically. Additionally, the study showed that effective communication
of organizational mission and values allowed employees to increase their effectiveness towards
organizational goals. Creating multiple channels of collaborative decision making toward organizational goals builds confidence and trust among all participating parties.
The organization needs time to engage in training opportunities offered. Data
collection and analysis revealed that the organization needs to provide time to engage in training
opportunities offered. This is an example of what Gallimore and Goldenberg (2001) refer to as
cultural settings that relate to the performance goals of an organization. A recommendation
grounded in organizational leadership theory has been selected to close this organizational gap.
Clark and Estes (2008) found that effective change efforts ensure that everyone has the resources
(time, training, equipment, etc.) needed to do their job and that if there are resource shortages,
then resources are aligned with organizational goals. This would suggest that providing further
time to explore the alignment of goals between the SEA and LEA organizations would support
further understanding of organizational goals and priorities. The recommendation then is to
provide frequent technical assistance and training opportunities with SEA reviewers and LEA
grant administrators to establish goals and time frames for the submission of the consolidated
application for federal funds.
Proper resources, including training, time, personnel, and equipment, must be provided by the organization for effective change efforts to occur (Clark & Estes, 2008). In a meta-analysis of training and organizational performance spanning 119 primary studies, Garavan et al. (2020) examined the moderating effects of the quantity of training time and organizational context factors on the relationship between time, training, and organizational performance. The authors' findings revealed that training is positively and directly related to organizational performance. The findings confirm that investment in training is directly associated with increased organizational performance, reporting a stronger relationship than previous meta-analyses. Additionally, increasing training over time plays an increasingly important role in how organizations respond to changing environments. Therefore, providing additional training time to meet organizational priorities is critical for organizational performance (Clark & Estes, 2008; Garavan et al., 2020).
LEAs need peer districts to model promising practice for implementation. Through
data collection and analysis, a gap was validated that the LEA and SEA organizations need peer
districts to model promising practices. A recommendation rooted in organizational leadership
theory has been selected to close this organizational gap. Schein (2017) noted that
organizational culture is created through shared experience, shared learning, and stability of
membership. This would suggest that the LEAs partner with peer school districts or consortiums
to foster new learning between both organizations. The recommendation then is to make
connections with partner school districts or LEAs who can extend the organization’s reach,
enhance its offerings, or energize practices.
Making connections with peer organizations creates a shared learning experience to
enhance organizational practices (Schein, 2017). García-Morales et al. (2006) used a global
model to empirically analyze how the personal and professional development of peer educators
facilitated the creation of shared vision, values, and interrelated strategic factors leading to
organizational performance. The researchers discovered through a national survey of educators
in Spain how the dissemination of learning practices and sharing of knowledge facilitates vision
and team learning. Finally, the researchers concluded that organizational learning is a key
component in the generation of organizational performance. This further reinforces the notion of peer-designed learning as a facilitator of organizational performance and shared vision and values.
Integrated Implementation and Evaluation Plan
The concepts and strategies involved in this integrated implementation and evaluation
plan are driven by the New World Kirkpatrick Model (Kirkpatrick & Kirkpatrick, 2016). This
plan calls for recommendations to be rooted in the context of the organization, supported by
organizational drivers, and monitored for successful implementation. The New World
Kirkpatrick Model recommends a three-phased approach to implementing solutions. The three
phases involve planning, execution, and the demonstration of value (Kirkpatrick & Kirkpatrick,
The New World Kirkpatrick Model is organized into four components that work as an iterative cycle, with the fourth level addressed in the first phase of the program. The authors recommend reversing the traditional order to keep the focus on the program outcomes that are accomplished through training and intentional development by the researcher and practitioner.
Level 4 involves results, or leading indicators and desired outcomes. Level 3 focuses on behavior, including the monitoring and reinforcement of behaviors. Level 2 involves the learning of knowledge, skills, and attitudes, and Level 1 addresses reaction, including engagement, relevance, and customer satisfaction (Kirkpatrick & Kirkpatrick, 2016).
Organizational Purpose, Need, and Expectations
The Western State Department of Education federal programs office serves as a steward
of federal funds offered through the Every Student Succeeds Act (ESSA). The goal of the
funding is to improve education programs and services and to close education achievement gaps
at the state and local level. The purpose of the federal programs office is to interpret program
requirements outlined in ESSA and inform local districts of program requirements to ensure the
purpose of the policy is being met through implementation. The state provides guidance through
the Consolidated Application manual to school districts to help navigate statutory expectations
and compliance measures. The office also ensures that federal funds are being utilized to
implement high impact programs and services that are evidence-based in achieving academic
outcomes for targeted student populations. The targeted populations include low-income
students, English learners, children with disabilities, children and youth in foster care, migratory
children, youth experiencing homelessness, neglected, delinquent, and immigrant students.
The goal of the WSDE federal programs unit as it relates to the problem of practice is for 100% of districts to show final approval on the consolidated application for federal funding within the 30-day approval cycle by July 31st, 2022. In conjunction with this goal and the intent
of ESSA legislation, WSDE seeks to enable LEAs to use federal funding to improve the
academic and linguistic outcomes for historically underserved student populations. According to
the education commissioner of the WSDE, the state agency seeks to provide “equity and
opportunity for every student, every step of the way” (K. [Redacted], personal communication, October 21st, 2019).
Level 4: Results and Leading Indicators
According to Kirkpatrick and Kirkpatrick (2016), Level 4 evaluation is based on measures of progress using leading indicators, defined as short-term observations and measurements that suggest critical behaviors are on track toward organizational goals. Rather than judging goals in binary terms of yes or no, these benchmarks provide internal and external indicators of progress. The metrics used as leading indicators of success can come in multiple modes, both observational and data-based, to determine the degree to which targeted program outcomes occur and contribute to the organization's goals (Kirkpatrick & Kirkpatrick, 2016). Table 25 illustrates the proposed Level
4: Results and Leading Indicators in the form of outcomes, metrics, and methods for both
external and internal outcomes for WSDE and the LEAs it serves. If the internal outcomes are
met as expected as a result of training and organizational support for both the LEA federal grant
administrators and the WSDE reviewers on the job, then the external outcomes should follow.
Table 25
Outcomes, Metrics, and Methods for External and Internal Outcomes

External Outcomes
1. Increased equitable educational outcomes for historically underserved students. Metric: activities and interventions purchased through supplemental funding streams contributing to equitable student outcomes. Method: solicit data on completed applications as well as reporting of increased achievement on school and district performance data frameworks.
2. Improved relationships between the state agency and school districts. Metrics: (a) number of complaints submitted to the state from local school districts; (b) length of time for the consolidated application to be reviewed by the SEA. Methods: (a) solicit quarterly data from WSDE and other state agencies; (b) collect data on reviewed and completed applications from WSDE.
3. Increased streamlining of grant approvals from the state agency within a 30-day period. Metric: track the number of submissions from LEAs and approvals of applications from the SEA. Method: track the length of time it takes to grant final approvals for all applications in the current fiscal year against the prior fiscal year.

Internal Outcomes
1. Decreased time to review the consolidated application. Metric: number of days taken to review state consolidated applications. Method: compare review time from 6 months to 30 days or a rolling deadline throughout the year.
2. Increased accuracy of review by the WSDE grant review team. Metric: track the number of errors in applications submitted to the SEA from the LEA. Method: errors and review comments submitted back to the SEA from the LEA.
3. Increased employee satisfaction. Metric: reviewers have the ability to provide support beyond compliance indicators. Method: compare the annual WSDE survey as well as the TLCC state survey of school/district personnel.
Level 3: Behavior
Critical behaviors. The New World Kirkpatrick Model refers to Level 3 indicators that measure the degree to which participants apply what they learned during training when they return to the job (Kirkpatrick & Kirkpatrick, 2016). In the context of this study, this involves three critical behaviors that both the grant administrators at the LEA and the grant reviewers from the SEA need in order to be successful. The first behavior is streamlining guidance to facilitate understanding of the components of the consolidated application for both the SEA and LEA. The second critical behavior is identifying misunderstandings in the consolidated application in order to streamline grant review timelines from the SEA to the LEA. The third is shifting the state agency's role from compliance officer to support agency. The specific metrics, methods, and timing for each of these outcome behaviors appear in Table 26.
Table 26
Critical Behaviors, Metrics, Methods, and Timing for Evaluation

Critical behavior 1: Streamline guidance around the elements and sections of the Consolidated Application. Metric: number of submitted applications that meet 100% compliance on the first submission. Methods and timing: (a) Regional Contacts and Consolidated Application review leads track application submissions and approval rates, with a monthly review of timelines associated with the annual planning document (Year at a Glance) between the LEA and SEA; (b) the Regional Contact reviews the application with the LEA to determine compliance prior to submission and reports submission status to the ESEA program team weekly within the first 30 days of submission.

Critical behavior 2: Identify misunderstandings in the consolidated application. Metric: number of applications that receive comments for revisions. Method: review plans with the LEA throughout the year. Timing: monthly and annual check-ins by fiscal year.

Critical behavior 3: Shift the state agency role from compliance officer to support agency. Metric: percentage of successful outcomes for targeted student populations collected from the LEA to understand performance benchmarks for students using federal funds. Methods and timing: (a) maintain a strong understanding of the percentage of students meeting or exceeding progress benchmarks to elevate practices using federal funds (monthly); (b) federal programs Regional Contacts provide support around effective uses of federal funding sources in addition to compliance checks (quarterly).
Required drivers. The concept of required drivers ensures that critical behaviors are occurring and clarifies how training is influencing those behaviors. According to Kirkpatrick and Kirkpatrick (2016),
required drivers are processes and systems that reinforce, monitor, encourage, and reward
performance of critical behaviors on the job. Required drivers provide additional levels of
support and accountability through reinforcement of the implementation strategies, monitoring of
behaviors and training, and encouragement of successes (Kirkpatrick & Kirkpatrick, 2016).
Based on data collected, it is highly probable that school district grant administrators do not have
the declarative and procedural knowledge necessary to successfully submit a consolidated
application for approval on the first submission. Additionally, the LEA and SEA do not have the
motivation and organizational support to increase the knowledge of the consolidated application
without collaborative effort by both agencies. Therefore, the following required drivers will be
used to support district grant administrators using job aids, monthly and weekly check-ins, peer
modeling, and performance incentives for the district. Table 27 identifies the recommended
drivers to support critical behaviors of managers and the timing of each driver.
Table 27
Required Drivers to Support Critical Behaviors

Reinforcing
Regional Contacts in the SEA will break down complex tasks involved in the consolidated application using job aids and formative benchmark assessments. Timing: ongoing. Critical behaviors supported: 1.
LEA and SEA check-in time to determine the strategic plan and theory of action as well as provide timely feedback linking the use of learning strategies to compliant and effective practices using evidence-based interventions. Timing: ongoing. Critical behaviors supported: 2, 3.
The LEA and SEA collect feedback from all LEAs in the state to determine needs for the consolidated application; the SEA will provide training utilizing case studies from that feedback. Timing: monthly. Critical behaviors supported: 1, 2, 3.

Encouraging
LEAs and the SEA collaborate to determine metrics that are leading toward equitable and promising outcomes for historically underserved students using federal funds. Timing: quarterly. Critical behaviors supported: 3.
The LEA and SEA collaborate to build feedback tools stressing the nature of learning and the importance of feedback in manageable, benchmarked components. Timing: ongoing. Critical behaviors supported: 1, 2, 3.

Rewarding
LEAs with promising practices are recognized at state and national conferences by showing effective practices of ESEA programs beyond compliance indicators. Timing: quarterly. Critical behaviors supported: 3.
Feedback and modeling from the SEA and LEA lead to a streamlined approval rate for the consolidated application. Timing: annually. Critical behaviors supported: 1, 2.

Monitoring
The SEA office will keep an “early warning system” tracker to enable the state agency to determine risk factors for submission prior to the submission deadline. Timing: ongoing. Critical behaviors supported: 2.
Collaboratively designed goals and benchmarks lead to school and district performance metrics that demonstrate confidence in both organizations' ability to succeed. Timing: ongoing. Critical behaviors supported: 2, 3.
Frequent technical assistance with peer districts leads to the monitoring of effective practices using federal funds. Timing: ongoing. Critical behaviors supported: 2, 3.
Organizational support. Based on the data collected, it is highly probable that the
organization needs to provide managers with the necessary resources to contribute to targeted
program outcomes and the highest possible results. As outlined by Kirkpatrick and Kirkpatrick
(2016), required drivers are processes and systems that reinforce, monitor, encourage, and
reward performance of critical behaviors on the job. The following recommendations are
outlined in a four-step, iterative process.
First, the organization should streamline guidance of the elements of the Consolidated
Application to ensure that both the SEA and LEA are knowledgeable in the components of the
application. Further benchmarking is required to understand elements of misunderstanding from
the LEA to SEA as well as more support for developing root cause and comprehensive needs
assessments. Second, through formative assessment of all LEAs in the state, the SEA can identify and prioritize applicants having issues completing the application before the submission deadline. Without further benchmarking, the SEA cannot provide supports, further widening the gap between compliance and more effective practices for historically underserved students. An example of this involves using the consolidated application user manual to create a key terms list along with a “year-at-a-glance” document outlining the key components of the application. This document also includes timelines associated with the deadlines and key terminology of the consolidated application for
federal funding. Third, the SEA and LEA must ensure that support efforts lead to equitable
outcomes as outlined in the Every Student Succeeds Act. The application from the LEA should not only meet federal statutory requirements but also show promising practices developed
in collaboration with multiple stakeholders including community, District Accountability
Committees, board members, School Accountability Committees, and the state agency.
Fourth, the training from the SEA to the LEA creates a space for the local school district to learn the
procedural knowledge necessary to accurately incorporate activities to address the underlying
needs determined through the needs assessment. While this is considered information or
declarative knowledge, further acquisition of experiential knowledge would be necessary to
capture the declarative, procedural, and metacognitive aspects of measuring programs from
compliance to effective practice. Further recommendations would include providing training
that utilizes case studies to model procedures related to the application of federal funds for
LEAs. Additionally, providing LEA workgroup sessions to engage in context-specific problems
of practice would enhance the learning from information to experiential practice. Districts can
be grouped in cohorts according to district portfolios discussed later in this chapter.
Level 2: Learning
Learning goals. Following completion of the recommended solutions, most notably the
Compliance to Effective Practices (C2E) Bootcamp and individual technical assistance support
with benchmarks of progress, the stakeholder will be able to:
1. Recognize the statutory requirements of each Title program with 100% accuracy
(D).
2. Classify the components of the Consolidated Application (D).
3. Recognize the components of the Consolidated Application manual (D).
4. Correctly clarify the Evidence-Based Interventions for students using federal
funds based on the comprehensive needs assessment (P).
5. Apply procedures of reasonable, allowable, and necessary tests to federal Title
programs (P).
6. Select and prioritize needs assessment based on local context (P, M).
7. Reflect on the effectiveness of Title programs and determine how they relate to
comprehensive needs assessment (D, M).
8. Design priorities and monitoring initiatives throughout the year with benchmarks
of progress (P, M).
9. Monitor the effectiveness of evidence-based interventions with the use of
journaling (M).
10. Attribute success to one’s own effort (Attribution).
11. Believe that the applicant is capable of submitting an application for federal funding and receiving approval upon the first submission (Self-efficacy).
Training program. The learning goals listed in the previous section will be achieved
with a training program based in a blended learning environment. The components of the
program will involve online learning modules, individual technical assistance supports, and adult
learning principle workshops at the SEA level. The purpose of the training is to ensure compliance with statutory requirements as outlined by the Every Student Succeeds Act (2018) as well as to align the funding with evidence-based outcomes, meeting the intent of ESSA to provide equitable outcomes for historically underserved students (ESSA, 2018).
Evaluation of the components of learning. Kirkpatrick and Kirkpatrick (2016) discuss
closing the gap between learning and behavior for personnel who understand the required
knowledge and skills but do not perform the job despite their understanding. Additionally,
confidence and commitment are components of participation in the training. Declarative, procedural, and metacognitive knowledge are all components of solving problems and working through the consolidated application in this study. It is also important for the participants to value the
training. Table 28 outlines the evaluation methods and timing for the knowledge, skills, attitude,
confidence, and commitment components of learning (Kirkpatrick & Kirkpatrick, 2016).
Table 28
Evaluation of the Components of Learning for the Program

Declarative Knowledge: “I know it.”
Formative assessment of the statutory components of the consolidated application through multiple-choice questions. Timing: in the asynchronous module, after video demonstrations, readings, and modeling are provided.
Knowledge checks on the consolidated application manual in a small group discussion with peer districts. Timing: virtual or in-person meetings to determine gaps in understanding of guidance to the LEA.

Procedural Skills: “I can do it right now.”
Workshop on evidence-based interventions for students using federal funds based on the comprehensive needs assessment. Timing: after the first statutory requirement module, with virtual pairing by the SEA for the LEA.
Modules on allowable practices under Title programs. Timing: asynchronous portions of the course, taken as needed.

Attitude: “I believe this is worthwhile.”
Prioritize the needs assessment based on local context and work with peer school districts of similar size and needs. Timing: during the workshop.
Discussions of the effectiveness and value of what the Title programs are doing and how they relate to the comprehensive needs assessment. Timing: during the workshop, in synchronous sessions offered through virtual video discussions and embedded in the module.

Confidence: “I think I can do it on the job.”
Benchmark priorities and monitoring initiatives throughout the year as a result of training guidelines. Timing: before, during, and after.
Discussions of planning and needs assessment. Timing: during the workshop.
Retrospective pre- and post-feedback, next steps, and needs for the following fiscal year. Timing: after the course.

Commitment: “I will do it on the job.”
Prioritize evidence-based interventions for the use of federal funds. Timing: before, during, and after the course.
As a result of training, set up benchmarking technical assistance time with the SEA for further troubleshooting. Timing: after the training.
Level 1: Reaction
Table 29 shows the methods and tools used to measure reactions to the program as well
as the timing in which the methods will be utilized.
Table 29
Components to Measure Reactions to the Program

Engagement
Attendance records. Timing: ongoing.
Asking meaningful questions. Timing: during the workshop.
Completion of practice scenarios. Timing: during the workshop.
Course evaluation. Timing: two weeks after the course.

Relevance
Pulse check via survey and/or discussion. Timing: after every module/lesson/unit and training/workshop.
Anonymous survey. Timing: two weeks after the course.

Customer Satisfaction
Brief pulse check with participants via an online survey. Timing: after every module/lesson/unit and training/workshop.
Course evaluation. Timing: two weeks after the course.
Evaluation Tools
Immediately following the program implementation. In the asynchronous aspect of
the course, the learning outcomes will be addressed and benchmarked through the learning
analytics tools in the learning management system (LMS). The LMS will be delivered through
the WSDE Federal programs office and will be used to drive instruction and module work within
each instructional component of the suggested training. In addition, the data collection will be
linked to a district portfolio for each LEA. The data collected will include the start, duration, and completion of modules by the participants, and data points will also be organized into manageable components within a district “dashboard” in Salesforce or integrated into current WSDE systems. The data points to be collected will include grant funding sources currently
utilized, the comprehensive needs assessment, as well as specific contextual information that is
unique to the district. The portfolio will help grant reviewers and support personnel with future
decisions regarding effective programming, research-based methods appropriate for the districts
as well as evidence-based interventions that could be beneficial to the targeted population at the
district. Finally, the tracker for contextual needs and size of districts will help to pair peer
districts for support. Further functionality could be considered for district personnel to dialogue
with other districts with similar needs.
This further functionality would include the districts' ability to engage in online or in-person networking opportunities in small cohorts (2-6 LEAs). The cohorts would work within
the context of their own size and scope of projects identified with federally funded initiatives.
The goals of the cohorts would be to engage in problem-of-practice workgroups, with dialogue leading to actionable outcomes around research- and evidence-based methodology to help close achievement gaps in the LEA and to model promising practices. The ultimate goal of modeling
promising practices with selected peer LEAs within a cohort would be to meet the organizational
and stakeholder auxiliary goal of “moving from compliance to effective practices.”
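A minimal sketch of how portfolio records might be grouped into the small peer cohorts described above is shown below, using district size as the grouping key. The portfolio fields (enrollment, funding_sources, needs) and the grouping rule are assumptions for illustration, not an existing WSDE data model.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DistrictPortfolio:
    name: str
    enrollment: int                                   # rough proxy for district size and scope
    funding_sources: List[str] = field(default_factory=list)
    needs: List[str] = field(default_factory=list)    # drawn from the comprehensive needs assessment

def build_cohorts(portfolios, max_size=6):
    # Sort districts by enrollment so cohorts contain LEAs of similar size, then
    # chunk them into groups of at most max_size (the 2-6 LEA cohorts described above).
    ordered = sorted(portfolios, key=lambda p: p.enrollment)
    return [ordered[i:i + max_size] for i in range(0, len(ordered), max_size)]

# Hypothetical usage with made-up districts.
portfolios = [
    DistrictPortfolio("District A", 1200, ["Title I", "Title III"], ["English learner supports"]),
    DistrictPortfolio("District B", 950, ["Title I"], ["rural staffing"]),
    DistrictPortfolio("District C", 15800, ["Title I", "Title II"], ["early literacy"]),
]
for cohort in build_cohorts(portfolios, max_size=2):
    print([p.name for p in cohort])

Contextual needs could further refine the grouping, for example by preferring cohorts whose members share a priority from the comprehensive needs assessment.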
For Level 1 training, as outlined by Kirkpatrick and Kirkpatrick (2016), the facilitator of
the online training and workshop will need to conduct frequent checks to ensure that the training is favorable, engaging, and relevant to the participant's work. This will be conducted through frequent pulse checks in the form of informal discussions as well as short end-of-lesson/unit survey questions. Level 2 checks will relate more to the value and commitment
of the participants and therefore will require checks for understanding using scenarios drawn
from the district context or from relevant examples. The LEA will also need to conduct frequent
discussions with the regional network representative to ensure comprehension and commitment
of the content with which the participants are engaging. Appendix A provides an example of the
training survey for immediately following the training.
Delayed for a period after the program implementation. Utilizing the Blended
Evaluation approach, the LEA will administer a survey approximately 4 weeks following the
successful implementation of the training. The Blended Evaluation survey will contain open and scaled items to capture the participant's perspective. The survey will measure the degree to which participants find the training favorable, engaging, and relevant to their work (Level 1); the confidence, value, and commitment they need to move forward in applying what they have learned on the job (Level 2); the degree to which the participants apply or change their behavior on the job (Level 3); and the degree to which the targeted program outcomes occur and contribute to organizational priorities, as well as priorities outlined in statute by the Every Student Succeeds Act (Level 4). Appendix B provides the instrument for delayed use after training.
Data Analysis and Reporting
The Level 4 goal of this implementation plan is to provide LEA and SEA representatives
with the knowledge, motivation, and organizational support to ensure that 100% of federal
funding applications from the LEA to the SEA are approved within the 30-day timeframe and
lead to effective practices for historically underserved students. Multiple metrics of school and district performance that are linked to federally funded initiatives will be used to measure the effectiveness of K-12 programs using federal funds. The following data
visualization presents the findings in an accessible manner that can be clearly and quickly
understood by the stakeholders. The Western State Department of Education has a very robust
data and analytics department that currently utilizes Tableau software to drive data analysis and
reporting. This could be leveraged to build out further dashboards displaying peer district performance as well as federal funding initiative metrics. The dashboards would be paired with federally funded initiatives and with metrics that show progress in these initiatives, and compared against peer districts with similar contexts. By having all of the LEA information in one “portfolio,” support services and needs assessments are much more streamlined and better understood by all stakeholders. This also presents the information in a visually engaging, creative, and interesting way that will foster stakeholder engagement. A sample of a district performance
dashboard is provided in Figure 7.
Figure 7. Sample District Performance Dashboard.
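One way to feed such a dashboard is to flatten each district's approval and outcome metrics into a single tidy extract that visualization software can consume. The sketch below writes a hypothetical CSV for that purpose; the column names and the file-based hand-off are assumptions, not a description of WSDE's Tableau environment.

import csv

def write_dashboard_extract(districts, path="district_dashboard_extract.csv"):
    # Flatten per-district records into one row per district so a dashboard can
    # compare approval timelines and outcome metrics across peer districts.
    columns = ["district", "cohort", "review_days", "first_pass_approval", "outcome_metric"]
    with open(path, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=columns)
        writer.writeheader()
        for record in districts:
            writer.writerow({column: record.get(column, "") for column in columns})

# Hypothetical usage with made-up values.
write_dashboard_extract([
    {"district": "District A", "cohort": 1, "review_days": 28, "first_pass_approval": True, "outcome_metric": 0.62},
    {"district": "District B", "cohort": 1, "review_days": 41, "first_pass_approval": False, "outcome_metric": 0.55},
])

Because the extract keeps one row per district, peer comparisons reduce to filtering on the cohort column within the dashboard itself.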
In addition to a portfolio outlining initiatives, funding strategies, assets, and leveraging
practices, a simpler contextual map helps to guide stakeholders in their understanding of the
state’s landscape as it relates to funding initiatives. Tufte (2006) describes data visualization as
an exploration of visual data and figures. The evidence lies in the messaging of the figures and
therefore visual representations should be accessible and easily identifiable for the reader.
Figure 8 presents the context for stakeholders in a visually appealing manner. This infographic presents funding amounts, student counts, and human capital in an accessible manner to provide a snapshot of the state.
Figure 8. Western State Department of Education Federal Funding Context.
Strengths and Weaknesses of the Approach
The Clark and Estes (2008) gap analysis approach coupled with the Kirkpatrick and
Kirkpatrick (2016) New World Kirkpatrick model offered a comprehensive approach for
solutions, recommendations, implementation, and evaluation. The gap exists in the space between the organization's current reality and its desired goals. The causes that exist between the goal and
current reality are defined as influences. This methodology was effective for this evaluative
study because it divides the influences into three dimensions: knowledge, motivation, and organizational factors. These influences are then categorized as causes, assets, or needs, depending upon the type of study. In this study, the influences combined with a review of literature
contributed to the root causes and potential recommendations. The structure of the gap analysis
model was effective for the purposes of this study. With this in mind, other types of root cause analyses, such as a driver diagram, logic model, or theory of action or implementation, could have been employed and could have proved equally effective.
This study benefited from the gap analysis model as the influences lent themselves to proposed
gaps in understanding, motivation, and organizational elements at both the SEA and LEA levels.
The Kirkpatrick and Kirkpatrick (2016) model effectively met the needs of the study as
an approach towards evaluation through intentional planning of the three major dimensions of
the model including planning, execution, and demonstration of value. This model integrates well
with the gap analysis because it takes identified gaps and designs interventions using blended
evaluation tools essential to further determine the needs of training initiatives. The main limitation to consider when implementing the program outlined in this study is the set of resources required to approach the work effectively. A concerted effort would be needed to ensure that time, monetary, and human resources are considered when designing interventions, because the recommendations would require all three to be implemented with fidelity. This would take a commitment from all
stakeholders involved to realize a return on investment as outlined by Kirkpatrick and Kirkpatrick
(2016). Often, in the public sector, these three resources are limited and require buy-in from all
parties involved, including policymakers.
Limitations and Delimitations
There were several limitations of this study, many of which presented themselves at the apex of the data collection and after Institutional Review Board (IRB) approval. Some other limitations
developed over the two years in which this study progressed. The two stakeholder groups for
this study, state education grant reviewers and local school district grant administrators, were selected due to their positions at each respective agency. The sample size for the quantitative survey component with LEA participants was rather high, as two or more people were represented for each school district selected. Another limitation was that only two groups
were selected. A larger population and sample size would help further triangulate data and
provide deeper insight into the research. Additionally, the sample size for the survey of SEA grant review representatives was relatively small. This study may have been
affected by participation bias as a result of voluntary participation from participants. According
to the literature, bias is any phenomenon that results in survey responses not reflecting the true
feelings of the respondent (Pazzaglia et al., 2016). The study focused on the processes involved at both the state and local levels for federally funded grant approval. As this was the subject
matter, participants that expressed interest could potentially have held an inherent bias in their
opinions on grant administration at both the state and local levels.
Finally, throughout the data collection and analysis, a global pandemic occurred that
potentially influenced many facets of the study including, but not limited to: staffing concerns,
anxiety around funding sources, well-being of family members and coworkers, budgetary
concerns, school closures, reconfiguring workspaces, remote learning, working from home, and
other tangible and intangible variables involved with the pandemic. For this reason, the qualitative surveys involved response questions for each SEA grant reviewer, and the response rate from the LEA grant administrators was lower than expected, potentially due to the effects of the pandemic.
Recommendations for Future Research
This study evaluated nine validated influences contributing to both the SEA and LEA
stakeholder goal to approve federal consolidated applications for K-12 funding within a thirty-
day period. While this study addressed gaps in knowledge, motivation, and organizational
influences affecting the organization accomplishing its goal, the implications of the research
could be further developed and realized through further investigation. As mentioned in chapter 2
of this study, there is limited research related to the subject matter of this study. Because of this,
opportunities exist to continue researching federal grant application approval and barriers that
exist in this process. The stakeholders surveyed in the study represented a small subset of all of
the individuals involved in grant development, implementation, monitoring, and review. In the
future, additional stakeholders would strengthen this research as other perspectives are explored.
It also became apparent through data collection that the context of grant development and review
was very dependent upon the size and scale of the school districts submitting the application. In
future studies, it is recommended to differentiate among school districts of different sizes and scopes. One other consideration as it relates to scope would be to include more interviews with the communities and families the funds serve. The opinions and feedback of the
primary recipients of the funds would provide rich data for analysis and recommendations for
further planning.
The researcher would recommend further comparative studies that crosswalk state
agencies to see how processes vary across larger subsections of the United States. Additional
grant review research could cross borders to explore innovative solutions from other countries
through improvement, innovation, or promising practice models of research. Another subject embedded in the discourse of this study involved state and local agencies considering evidence- and research-based solutions that leverage funding sources to move beyond compliance to effective practices for students. Further consideration of leveraging funding sources to create
equitable outcomes for historically underserved students could also enhance future research.
Conclusion
The goal of the WSDE federal programs unit as it relates to the problem of practice is for
100% of districts to show final approval on the consolidated application for federal funding
within the 30-day approval cycle by July 31st, 2022. In conjunction with this goal and the intent
of ESSA legislation, WSDE seeks to enable LEAs to use federal funding to improve the
academic and linguistic outcomes for historically underserved student populations. According to
the education commissioner of the WSDE, the state agency seeks to provide “equity and
opportunity for every student, every step of the way” (K. [Redacted], personal communication,
October 21st, 2019). This study utilized the Clark and Estes (2008) Gap Analysis framework
considering knowledge, motivation, and organizational influences that lead to organizational
performance gaps.
Once gaps were established through the review of literature coupled with surveys of both the LEA and SEA stakeholders, the findings were identified through data collection and analysis.
Recommendations were proposed to close knowledge, motivation, and organization gaps
affecting the organization’s ability to accomplish the goal. With this organizational goal in
mind, the Kirkpatrick and Kirkpatrick (2016) New World Model was used to craft an
implementation and evaluation plan to incorporate the proposed recommendations successfully.
This model was selected due to its advantages of integrating evaluation and implementation and
the ultimate return on expectations. The model requires defining program outcomes and
identifying critical behaviors and required drivers that define what employees need to do on the
job to maximize organizational results that stakeholders seek to influence. The three phases of
the program include planning, execution, and demonstration of value. Once the intervention is
established and designed, the blended evaluation tools are used to connect the levels of
evaluation and use the evaluation data to create actionable recommendations for the execution phase. Finally, the return on investment comes from the training value reported through the use of the recommended dashboard coupled with further reporting tools. As Kirkpatrick and Kirkpatrick (2016) note, the return on investment is the ultimate indicator of value, which is brought forth through relationships with necessary stakeholders and compelling evidence as a
result of the evaluation program.
The Elementary and Secondary Education Act of 1965 was born from a tumultuous
landscape in which students living in poverty, students from highly affected racial groups, and students of other national origins were not receiving equitable academic outcomes. ESEA was designed to take on one of
society’s greatest challenges and that is to ensure that everyone has an equal opportunity to
participate in the future. Part of this conversation is funding for our schools. Now, more than
ever, education demands creativity in order to close achievement gaps and improve linguistic and academic outcomes for historically underserved students.
Federally funded programs presented a new horizon for the education of disadvantaged
children, marking a new era in education reform. This is evident within the first line of the Act,
charging educators “to provide all children significant opportunity to receive a fair, equitable,
and high-quality education, and to close educational achievement gaps.” With the adoption of
the Every Student Succeeds Act (ESSA) in 2015, the federal government provides school
districts with the ability to design equitably-driven initiatives under the guidance of the state
agency. While the initiatives under ESSA are intended to prepare students for the rigorous
demands of the 21
st
century, there is a general lack of literature and research to prove that this is
taking place. As a state agency, it is still unclear whether the federally funded initiatives are
leading to improved outcomes for historically underserved students.
My goal for this study is to disrupt the massive puzzle of federal funding for K-12 public
schools and assist in designing equitable outcomes for historically underserved students. By
streamlining grant administration, we can provide funding to students in a timely manner and
start to focus more on research-based and equitable outcomes for students.
References
Agocs, C. (1997). Institutionalized resistance to organizational change: Denial, inaction, and
repression. Journal of Business Ethics, 16, 917-931.
Aguinis, H., & Kraiger, K. (2009). Benefits of training and development for individuals and
teams, organizations, and society. Annual Review of Psychology, 60, 451–474. doi:
10.1146/annurev.psych.60.110707.163505
Anderman, E. & Anderman, L. (2006). Attributions. Retrieved from http://www.education.
com/reference/article/attribution-theory/.
Aragon, S., Griffith, M., Wixom, M. A., Woods, J., & Workman, E. (2016). ESSA: Quick
Guides on Top Issues. Education Commission of the States.
Baker, L. (2006). Metacognition. Retrieved from http://www.education.com/reference/article/
metacognition/.
Bandura, A. (2005). The evolution of social cognitive theory. In K. G. Smith & M. A. Hitt
(Eds.), Great Minds in Management (pp. 9–35). Oxford: Oxford University.
Bensimon, E., & Neuman, A. (1993). Leadership by teams: The need, the promise, and the
reality. In Redesigning collegiate leadership: Teams and teamwork in higher education
(pp. 1-13). Baltimore, MD: The Johns Hopkins University Press.
Berbary, D. & Malinchak, A. (2011). Connected and engaged: The value of government
learning. The Public Manager, Fall, 55-59.
Berger, B. (2014). Read my lips: Leaders, supervisors, and culture are the foundations of
strategic employee communications. Research Journal of the Institute for Public
Relations, 1(1).
Borman, G. D. (2000). Title I: The evolving research base. Journal of Education for Students
Placed at Risk (JESPAR), 5(1-2), 27-45. doi:10.1080/10824669.2000.9671378
Bowen, G.A. (2009). Document analysis as a research method. Qualitative Research Journal
9(2).
Carnevale, A., Smith, N., & Strohl, J. (2013). Recovery: Projections of jobs and education
requirements through 2020. Retrieved from https://1gyhoq479ufd3yna29x7ubjn-
wpengine.netdna-ssl.com/wp-content/uploads/2014/11/Recovery2020.FR_.Web_.pdf
Civil Rights Act of 1964 § 6, 42 U.S.C. §2000e et seq (1964). Retrieved from:
http://www.eeoc.gov/laws/statutes/titlevii.cfm.
Clark, R. E. & Estes, F. (2008). Turning research into results: A guide to selecting the
right performance solutions. Charlotte, NC: Information Age Publishing, Inc.
Cook-Harvey, C. M., Darling-Hammond, L., Lam, L., Mercer, C., & Roc, M. (2016). Equity
and ESSA: Leveraging educational opportunity through the Every Student Succeeds Act.
Washington, DC: Learning Policy Institute.
Creswell, J.W.& Creswell, J. D.(2018). Research design: Qualitative, quantitative, and
mixed methods approaches. Thousand Oaks, CA: Sage Publications.
Darling-Hammond, L., Bae, S., Cook-Harvey, C. M., Lam, L., Mercer, C., Podolsky, A., &
Stosich, E. L. (2016). Pathways to new accountability through the Every Student
Succeeds Act. Palo Alto, CA: Learning Policy Institute.
Dennis, D. V. (2017). Learning from the past: What ESSA has the chance to get right. The
Reading Teacher, 70(4), 395-400.
Dynarski, M. (2015). Using research to improve education under the Every Student Succeeds
Act. Evidence Speaks Reports, 1(8), 1-6.
Education Department Grant Administrative Regulations. The administrator's handbook on
EDGAR (4th ed.). Washington, DC: Brandylane Publishers.
Elementary and Secondary Education Act of 1965 [As Amended Through P.L. 115-141,
Enacted March 23, 2018]. Retrieved from: http://www.gadoe.org/School-
Improvement/TeacherandLeaderEffectiveness/Documents/Title%20II%2C%20Part%20A
%20Documents/Guidance/ESSA_03.23.18.pdf.
Ellerson, N. M. (2012). Weathering the Storm: How the Economic Recession Continues to
Impact School Districts. Report of Findings. American Association of School
Administrators.
Erez, M. & Gati, E. (2004). A dynamic, multi-level model of culture: From the micro level of the
individual to the macro level of global culture. Applied Psychology: An International
Review, 53(4), 583-598.
Fink, A. (2013). How to conduct surveys: A step-by-step guide. (5th ed.). Thousand
Oaks: Sage.
Fránquiz, M. E., & Ortiz, A. A. (2016). Co-editors’ introduction: Every student succeeds Act—A
policy shift. Bilingual Research Journal, 39(1), 1-3.
doi:10.1080/15235882.2016.1148996
Gallimore, R. & Goldenberg, C. (2001). Analyzing cultural models and settings to connect
minority achievement and school improvement research. Educational Psychologist,
31(1), 45-56.
Glesne, C. (2011). Chapter 6: But is it ethical? Considering what is “right.” In Becoming
qualitative researchers: An introduction (4th ed.) (pp. 162-183). Boston, MA: Pearson.
Grossman, R., & Salas, E. (2011). The transfer of training: what really matters. International
Journal of Training and Development, 15(2), 103–120. https://doi.org/10.1111/j.1468-
2419.2011.00373.x
Heinrich, C. J. (2002). Outcomes–based performance management in the public sector:
implications for government accountability and effectiveness. Public Administration
Review, 62(6), 712-72.
Heise, M. (2017). From no child left behind to every student succeeds: Back to a future for
education federalism. Columbia Law Review, 117(7), 1859-1896.
Hill, S., Kogler, T., & Keller, L. (2009). A collaborative, ongoing university strategic planning
framework: Process, landmines, and lessons. Planning for Higher Education, 37(4), 16-
26.
Jiménez-Castellanos, O., López, P. D., & Rivera, M. (2019). The politics of K–12 local control
funding and accountability for Latinx and ELL students: Lessons learned from California.
Peabody Journal of Education, 94(2), 115-121. doi:10.1080/0161956X.2019.1598099
Johnson, B., & Christensen, L. (2014). Educational research: Quantitative, qualitative,
and mixed approaches. Sage.
Jones, D., Khalil, D., & Dixon, R. D. (2017). Teacher-advocates respond to ESSA: “Support the
good Parts—Resist the bad parts”. Peabody Journal of Education, 92(4), 445-465.
doi:10.1080/0161956X.2017.1349479
K. [Redacted], personal communication, October 21st, 2019.
Kainz, K. (2019). Early academic gaps and Title I programming in high poverty, high minority
schools. Early Childhood Research Quarterly, 47, 159-168.
doi:10.1016/j.ecresq.2018.08.012
Kelley, H. H., & Michela, J. L. (1980). Attribution theory and research. Annual review of
psychology, 31(1), 457-501.
Kezar, A. (2001). Theories and models of organizational change. Understanding and
facilitating organizational change in the 21st century: Recent research and
conceptualizations. ASHE-ERIC Higher Education Report Volume 28(4), 25-58.
Kirkpatrick, D. L. (2006). Seven keys to unlock the four levels of evaluation. Performance
Improvement, 45, 5–8.
Kirkpatrick J. (2008). The new world level 1 reaction sheets. Retrieved from http://www.
kirkpatrickpartners.com/Portals/0/Storage/The%20new%20world%20level%201%20reac
tion%20sheets.pdf.
Kirkpatrick, J. D., & Kirkpatrick, W. K. (2016). Kirkpatrick’s four levels of training evaluation.
Alexandria, VA: ATD Press.
Kirschner, P., Kirschner, F., & Paas, F. (2006). Cognitive load theory. Retrieved from
http://www.education.com/reference/article/cognitive-load-theory/.
Korsgaard, M., Brodt, S., & Whitener, E. (2002). Trust in the face of conflict: The role of
managerial trustworthy behavior and organizational context. Journal of Applied
Psychology, 87(2), 312-319.
Krathwohl, D. R. (2002). A revision of Bloom’s Taxonomy: An overview. Theory Into Practice,
41, 212–218. doi:10.1207/s15430421tip4104_2.
Krueger, R. A., & Casey, M. A. (2009). Focus groups: A practical guide for applied
research (4th ed.). Thousand Oaks, CA: SAGE Publications.
Lewis, L. K. (2011). Organizational change: Creating change through strategic communication
(Vol. 4). New York, NY: John Wiley & Sons.
Mathis, W. J., & Trujillo, T. M. (2016). Lessons from NCLB for the Every Student Succeeds
Act. National Education Policy Center.
Maxwell, J. A. (2013). Qualitative research design: An interactive approach. (3rd ed.).
Thousand Oaks: Sage.
McEwan, E. K., & McEwan, P. J. (2003). Making sense of research. Sage.
Merriam, S. B., & Tisdell, E. J. (2016). Qualitative research: A guide to design and
implementation (4th ed.). San Francisco, CA: Jossey-Bass.
N. [Redacted], personal communication, February 21st, 2019.
National Center for Education Statistics (2015). Trends in high school dropout and completion
rates in the United States: 1972-2012. Washington, DC: US Department of Education.
Retrieved from https://nces.ed.gov/pubs2015/2015015.pdf
Next steps for K-12 education: upholding the letter and intent of the Every Student Succeeds Act:
hearing before the Committee on Education and the Workforce, U.S. House of
Representatives, One Hundred Fourteenth Congress, second session, hearing held in
Washington, DC, February 25, 2016. Washington: U.S. Government Publishing Office.
Ohio Department of Education. (2015). Compliance consolidated ESEA grants self-survey.
Retrieved from
https://ccip.ode.state.oh.us/documentlibrary/ViewDocument.aspx?DocumentKey=79590
Ozcan, Y. A. (2008). Health care benchmarking and performance evaluation. An
Assessment using Data Envelopment Analysis (DEA), (Ed.). Springer Science Business
Media, New York, 4.
Pajares, F. (2006). Self-efficacy theory. Retrieved from
http://www.education.com/reference/article/self-efficacy-theory/.
Patton, M. Q. (2002). Qualitative research & evaluation methods (3rd ed.). Thousand Oaks:
SAGE Publications.
Pier, E., Brauer, M., Filut, A., Kaatz, A., Raclaw, J., Nathan, M., Ford, C., & Carnes, M. (2018).
Low agreement among reviewers evaluating the same NIH grant applications.
Proceedings of the National Academy of Sciences - PNAS, 115(12), 2952–2957.
https://doi.org/10.1073/pnas.1714379115
Pintrich, P. R. (2003). A motivational science perspective on the role of student motivation in
learning and teaching contexts. Journal of Educational Psychology, 95, 667–686.
doi:10.1037/0022-0663.95.4.667
Rioux, J. W. (1965). New opportunities: Economic Opportunity Act and Elementary and
Secondary Education Act of 1965. Childhood Education, 42(1), 9-11.
doi:10.1080/00094056.1965.10729032
Robinson, S.B. & Firth Leonard, K. (2019). Designing quality survey questions. Los Angeles:
SAGE.
Rubin, H. J., & Rubin, I. S. (2012). Chapter 6: Conversational partnerships. In Qualitative
interviewing: The art of hearing data (3rd ed.) (pp. 85-92). Thousand Oaks, CA: SAGE
Publications.
Rueda, R. (2011). The 3 dimensions of improving student performance. New York: Teachers
College Press.
Salkind, N. J. (2017). Statistics for people who (think they) hate statistics: Using Microsoft Excel
2016 (4th ed.). Thousand Oaks, CA: SAGE.
Schein, E. H. (2017). Organizational culture and leadership (5th ed.). San Francisco, CA:
Jossey-Bass.
Senge, P. (1990). The leader’s new work: Building learning organizations. Sloan
Management Review, 32(1), 7-23.
Senge, P. (1999). Creative tension. Executive Excellence, 12–13. Retrieved from
http://search.proquest.com/docview/204611455/
Stringer, E. T. (2014). Action research (4th ed.). Los Angeles, CA: SAGE.
Thomas, J., & Brady, K. (2005). The Elementary and Secondary Education Act at 40: Equity,
Accountability, and the Evolving Federal Role in Public Education. Review of Research
in Education, 29, 51–67.
Tufte, E. (2006). Beautiful evidence. Graphics Press.
United States Department of Education, Office for Civil Rights. (2017). Retrieved from
https://www2.ed.gov/about/offices/list/ocr/index.html
United States Department of Education. (2017). Use of funds overview for the ESSA
consolidated application programs. Retrieved from https://doe.sd.gov/ofm/documents/17-
%20ESSAspend.pdf
Wanker, W. P., & Christie, K. (2005). State implementation of the No Child Left Behind Act.
Peabody Journal of Education, 80(2), 57-72. doi:10.1207/S15327930pje8002_4
Weiss, R. S. (1994). Learning from strangers: The art and method of qualitative interview
studies. New York, NY: The Free Press.
Zak, P. (2017). Trust factor: The science of creating high-performance companies. New York,
NY: AMACOM.
APPENDIX A
Instructions:
Thank you for your participation in this survey. The purpose of this study is to examine school
districts’ need for federal funding and the issues involved in obtaining final approval of Every
Student Succeeds Act (ESSA) funding applications from the state agency. As a grant
administrator in your district, your feedback helps us understand the state agency’s training and
review process in relation to the school district. Your answers will remain anonymous and your
participation is entirely voluntary. This research is being conducted by a doctoral student from
the University of Southern California. Your responses will be stored on a local hard drive and
housed in secure storage in the WSDE federal programs unit.
Survey Items
Question Block 1: Please answer the following to provide background as it relates to your
position in your school district.
Scale of Measurement: Continuous (Ratio) and Nominal
Potential Analyses: Percentage, Frequency, Mode, Median, Mean, Standard Deviation, Range
Research Question/Data Type | KMO Construct | Survey Item (question and response)
Demographics NA I have worked for this school district for ____ years.
(Drop down list)
Demographics NA I have worked as an authorized representative of federal funds for ____ years.
(Drop down list)
Demographics NA I am:
Male
Female
Other preferred description
Prefer not to say
(Drop down list)
Research Question 2 K-P On average, how many hours are spent in the development of the
Consolidated Application?
(Drop down list of hours)
Research Question 2 K-P On average, how many hours are spent in the resubmission of the
Consolidated Application?
(Drop down list of hours)
Research Question 2 O-CM Indicate how many iterations of last year’s consolidated application
you encountered. (An iteration includes resubmission after reviewer comment from the state
education agency.)
(Drop down list of number of iterations)
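Note: The following is an illustrative sketch only of how the potential analyses listed for
Question Block 1 (percentage, frequency, mode, median, mean, standard deviation, and range)
could be computed once responses are exported. It is not part of the survey instrument, and the
file name and column names (e.g., years_in_district, hours_development) are hypothetical
placeholders rather than the study’s actual variables or analysis code.

import pandas as pd

# Hypothetical export of Question Block 1 responses; column names are placeholders.
responses = pd.read_csv("block1_responses.csv")

ratio_items = ["years_in_district", "years_as_federal_rep",
               "hours_development", "hours_resubmission", "iterations"]
nominal_items = ["gender"]

# Continuous (ratio) items: mean, median, mode, standard deviation, and range.
for item in ratio_items:
    col = pd.to_numeric(responses[item], errors="coerce").dropna()
    print(item,
          "mean:", round(col.mean(), 2),
          "median:", col.median(),
          "mode:", col.mode().tolist(),
          "sd:", round(col.std(), 2),
          "range:", col.max() - col.min())

# Nominal items: frequency and percentage for each response option.
for item in nominal_items:
    counts = responses[item].value_counts(dropna=False)
    percents = (counts / counts.sum() * 100).round(1)
    print(pd.DataFrame({"frequency": counts, "percent": percents}))

Continuous (ratio) items are summarized with measures of central tendency and spread, while
nominal items are reported as frequencies and percentages, mirroring the analysis plan above.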
Question Block 2: Please indicate your level of agreement with the following statements (Likert scale).
Scale of Measurement: Ordinal
Potential Analyses: Percentage, Frequency, Mode, Median, Range
Research Question 2 K-D I am familiar with the Consolidated Application manual for
federal programming.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 2 K-D I use the Consolidated Application manual frequently to help
inform my application for federal funds.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 2 K-D I am familiar with the Year at A Glance Document.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 1 K-D I am familiar with the ESSA State Plan.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 1 K-M I am familiar with the Comprehensive Needs Assessment.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 1 K-M I feel that my efforts to receive final approval on the
Consolidated Application will lead to equitable outcomes for
students.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 2 M-A My success on the application is dependent on my
knowledge of the consolidated application for federal
funding.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 2 M-A My success on the application is dependent on the reviewer.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 1 M-SE I am confident in my ability to create an allowable activity
under ESSA statute in the consolidated application for
federal funding.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 2 M-SE I am confident in my ability to submit an application for
federal funding that receives substantial approval.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 2 M-SE I am confident that my use of federal funds is meeting the
needs of my LEA, both academically and linguistically.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 2 M-A I am confident that my submission of the Consolidated
Application for federal funds will receive 100% final
approval upon the first submission.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 1 O-CM Review comments from the state agency are helpful for my school
district to receive 100% final approval on the consolidated application for federal
funding.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 1 O-CS My district relies on the State Education Agency to provide
valuable feedback on the consolidated application.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 1 O-CM I believe Network Meetings help me to submit an application that
receives 100% final approval from the state agency.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
Research Question 1 O-CM I plan to attend the next Regional Network Meeting.
__ Strongly Agree
__ Agree
__ Undecided
__ Disagree
__ Strongly Disagree
K-D=Knowledge-Declarative, K-P=Knowledge-Procedural, K-M=Knowledge-Metacognitive;
M-A=Motivation-Attribution, M-SE=Motivation-Self-Efficacy;
O-CM=Organization-Cultural Models, O-CS=Organization-Cultural Settings
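Note: As with Question Block 1, the following is an illustrative sketch only of how the ordinal
Likert items in Question Block 2 could be tallied by KMO construct (percentage, frequency,
mode, and median) once responses are exported. The file name, column names, and construct
mapping are hypothetical placeholders and not the study’s actual analysis code.

import pandas as pd

likert_order = ["Strongly Disagree", "Disagree", "Undecided", "Agree", "Strongly Agree"]

# Hypothetical mapping of survey columns to the KMO construct codes in the key above.
construct_map = {
    "familiar_manual": "K-D",
    "use_manual_frequently": "K-D",
    "familiar_needs_assessment": "K-M",
    "success_depends_on_reviewer": "M-A",
    "confident_allowable_activity": "M-SE",
    "review_comments_helpful": "O-CM",
}

responses = pd.read_csv("block2_responses.csv")  # hypothetical export file

for item, construct in construct_map.items():
    # Treat each item as an ordered categorical so response options keep their order.
    col = pd.Categorical(responses[item], categories=likert_order, ordered=True)
    counts = pd.Series(col).value_counts().reindex(likert_order, fill_value=0)
    percents = (counts / counts.sum() * 100).round(1)
    # Code responses 1-5 so the median can be reported alongside the modal response.
    codes = pd.Series(col).cat.codes
    codes = codes[codes >= 0] + 1
    print(construct, item, "median:", codes.median(), "mode:", counts.idxmax())
    print(pd.DataFrame({"frequency": counts, "percent": percents}))

Grouping items by construct in this way would let frequencies and percentages be compared
across the knowledge, motivation, and organization influences named in the key above.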
APPENDIX B
State Education Agency Qualitative Survey
Thank you for your participation in this survey. The purpose of this study is to examine school
districts’ need for federal funding and the issues involved in obtaining final approval of their
applications from the state agency. As a member of the Western State Department of Education,
your feedback helps us understand the state agency’s training and review process in relation to
the school district. Your answers will remain anonymous and your participation is entirely
voluntary. This research is being conducted by a doctoral student from the University of
Southern California. Your responses will be stored on a local hard drive and housed in secure
storage in the WSDE federal programs unit.
Let’s begin with the consolidated application for federal funding. Please provide as much
insight as possible in relation to this portion of your work, if applicable.
1. How well do you understand, if at all, the consolidated application and its contents?
(Knowledge, Patton Experience)
2. How is the SEA helping, if at all, to train the LEA in the aspects of the consolidated
application? (Knowledge)
3. How does the SEA support, if at all, the LEA in planning initiatives? (Knowledge)
Please provide one or more specific details related to planning initiatives for school districts’
federal funding.
4. How does the SEA support LEAs in moving beyond compliance to effective practices, if
at all? (Knowledge, Strauss et al., Ideal)
Please explain what compliance on the federal application means to you.
5. What are some factors that influence successful applications for federal funding?
(Motivation, Strauss et al., Interpretive)
6. What roadblocks stand in the way of a successful application? (Motivation)
Considering the influences and roadblocks to a successful application, please take a moment to
consider the next question.
7. Why (or why not) do you think the LEA is capable of receiving final approval for an
application for federal funding upon the first submission? (Motivation)
Based on your own technical assistance and participation with school districts, please consider
the following.
8. To what degree do you feel confident in your ability to review/approve applications for
federal funding? (Motivation, Patton Opinion/Value)
9. Describe the role of the federal program staff in providing feedback to districts
concerning the consolidated application for federal funding. Do you believe that the
review comments are helpful or useful? (Organization, Patton Feeling)
Thank you for your insight. Do you have any more examples you would like to include?
10. Why does every district need review comments in order to submit an application for
federal funding that meets federal compliance measures? (Organization, Strauss et al.,
Ideal/Interpretive)
11. How do you measure the effectiveness of regional network trainings in assisting LEAs to
submit an application for federal funding with 100% approval? (Organization)
12. Describe the effectiveness of the training opportunities provided by the SEA to the LEA
for federally funded initiatives. (Organization)
APPENDIX C
Please select the rating that most accurately reflects your position on the following statements.
Questions | Strongly Agree | Agree | Disagree | Strongly Disagree
1. The training held my interest.
2. I have found value in the training
received.
3. Discussions during training helped
me to understand how to apply what I
have learned.
4. I will recommend this program to
other grant administrators.
5. I believe it is important for me to
incorporate what I have learned into
my planning for federally funded
initiatives.
6. I believe it is important for me to
incorporate what I have learned into
my implementation of federally funded
initiatives.
7. The feedback I have received has
increased my confidence to apply
what I have learned to my job.
8. I feel confident about applying
what I learned back on the job.
Open-ended Questions:
1. What part of training was most beneficial for you?
2. What part of training was least beneficial for you?
3. What were the major concepts you learned today?
Adapted from Kirkpatrick and Kirkpatrick (2016)
APPENDIX D
Delayed Evaluation Tools Blended Model Approach
(Levels 1-4 Assessment)
Journal Activity:
Please refer to your journal to discuss monitoring of effective evidence-based interventions.
Prepare to discuss this further with a peer in the group.
90 Day Reflection of Learning:
Please select the rating that most accurately reflects your position on the following statements.
Question | Strongly Agree | Agree | Disagree | Strongly Disagree
1. The online learning environment
helped contribute to my knowledge on
this subject.
2. My participation was encouraged by
the facilitator.
3. I received helpful information
regarding federal programming.
4. I believe what I learned is worthwhile.
5. I feel confident about applying what I
have learned about federally funded
initiatives to my job.
6. I anticipate I will receive the
necessary support to successfully apply
what I have learned.
Critical Behaviors
Behavior Rating Scale: Using the rating scale below, please click on the rating that best
describes your current level of on-the-job application for each listed behavior.
1 - Little or no application
2 - Mild degree of application
3 - Moderate degree of application
4 - Strong Degree of Application
5 - Very strong degree of application and desire to help others to do the same
Critical Behavior Objectives
Streamline guidance around the elements and sections of the Consolidated Application. 1 2 3 4 5
Identify misunderstandings in the consolidated application. 1 2 3 4 5
Shift the state agency role from compliance officer to support agency. 1 2 3 4 5
If you selected a rating of 5 for the previous items, rate the contribution of each of the following
factors to your effective performance:
Contributing Factor Rating
The course itself Not at all Low Medium High
Coaching from facilitator Not at all Low Medium High
Support and/or encouragement Not at all Low Medium High
Effective system of accountability or
monitoring
Not at all Low Medium High
Belief that it would help me to be more
effective in my work.
Not at all Low Medium High
Ongoing training I have received after the
initial class
Not at all Low Medium High
Payment of bonus or other incentives Not at all Low Medium High
Community of practice or other peer support Not at all Low Medium High
Job aids Not at all Low Medium High
Other (please specify) Not at all Low Medium High
If you selected a rating of 4 or below, please indicate the reasons (check all that apply):
⧠ I do not have the necessary knowledge and skills.
⧠ I do not have a clear picture of what is expected of me.
⧠ I have other, higher priorities.
⧠ I do not have the necessary resources to apply what I learned.
⧠ The training didn’t give me the confidence to apply what I learned.
⧠ I don’t think what I learned will work.
⧠ There is not an adequate system of accountability to ensure application of what I learned.
⧠ Other (please explain)
Open-Ended Questions:
1. How are you currently using what you learned during the federal compliance to effective
practices training?
2. What positive outcomes are you seeing as a result of what you are doing?
3. To what can you attribute that success?
4. If you are not using the skills you have learned during the training, what are the reasons?
5. Looking back, what would you change about this course?
6. How have you used what you learned in training on the job?
Adapted from Kirkpatrick and Kirkpatrick (2016)
Abstract
The passage of the Elementary and Secondary Education Act, as reauthorized by the Every Student Succeeds Act, has afforded opportunities to close achievement gaps for historically underserved students. The purpose of this study is to examine the need for Local Education Agencies to receive final approval for K-12 public education federal funding applications. The project goal is to develop and implement training delivered by the State Education Agency to help the LEA achieve 100% approval of the Consolidated Application for federal funds.
The research questions used to guide this study are (1) What are the knowledge, motivation, and organizational needs necessary for the LEA to achieve 100% approval (compliance) of consolidated applications for federal funding