ESTABLISHING A SYSTEMATIC EVALUATION OF PBIS 1
ESTABLISHING A SYSTEMATIC EVALUATION OF POSITIVE BEHAVIORAL
INTERVENTIONS AND SUPPORTS TO IMPROVE IMPLEMENTATION AND
ACCOUNTABILITY APPROACHES USING A GAP ANALYSIS FRAMEWORK
by
Maria G. Ruelas
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
August 2018
Copyright 2018 Maria G. Ruelas
Dedication
To God, without Him there would not be a me.
I thank my God for graciously granting me the opportunity of learning that knowledge is
a treasure, and practicing is the key. I appreciate the lessons I have learned and the countless
blessings I have received throughout the journey.
To my dear mother, to whom I am forever indebted for loving and believing in me even
before I was born. I am deeply grateful for her optimism, perseverance, and resilience. She is
the ultimate definition of a strong woman, and I thank her for raising me to be one.
To my dear father, whom I appreciate for trusting and believing in me every step of the
way. He raised me to be a caring, hard-working, and responsible human being. Because of him
I know that if it was easy, everyone would do it.
To my furry four-legged son, Romeo, who greeted me every night with a wagging tail,
turned my frowns upside down, and showered me with selfless love.
To my friends and mentors for understanding, believing, and cheering me on during the
darkest paths on the road.
To my humble communities in Inglewood and Lennox who raised me and taught me that
no dreams are too big, ‘si se pudo’.
To my loudest cheerleader, Dr. Paula Angulo-Landeta who taught me to always look
above and beyond the surface, instilling a strong sense of ambition and drive. Her ‘si se puede’
attitude motivated me to change my family’s legacy by graduating high school and now,
obtaining my doctorate degree. Because of you, I am a confident Latina woman who is devoted
to social justice and social change.
Acknowledgments
To those in the USC Rossier School of Education, your support throughout my
educational journey was immeasurable. Your commitment, encouragement, and continual
support led me to this point in time.
To Dr. Melora Sundt, my committee Co-Chair. Thank you for your knowledge,
expertise, and guidance throughout this journey. Thank you for making me reflect on the
qualities I bring to the table and fostering my growth as a leader.
To Dr. Kenneth Yates, my advisor and committee Co-Chair. Thank you for believing in
me and trusting that I would reach the finish line. Your patience and tenacity in ensuring my
success in the program were unparalleled. Thank you for being such an important part of my
story; I will never forget the influence you had on me.
To Dr. Betsy Hamilton, my committee member. Thank you for allowing me to infuse my
journey with joy. I am grateful for your heartfelt support throughout my ‘Jedi training sessions’
and for your ‘do or do not. There is no try’ leadership approach.
Finally, I would like to acknowledge Dr. Stephen Hawking, who reminded me
to look up at the stars, make sense of what I see, and simply not give up. May he rest in peace.
Table of Contents
List of Tables 11
Abstract 15
Chapter One: Introduction 16
Introduction of the Problem of Practice 16
Organizational Context and Mission 16
Organizational Performance Need 18
Related Literature 19
Importance of the Organizational Innovation 21
Organizational Performance Goal 21
Critical Behaviors 22
Stakeholder for the Study and Stakeholder Performance Gap 23
Purpose of the Project and Questions 23
Methodological Framework 24
Definitions 24
Organization of the Proposal 26
Chapter Two: Review of the Literature 27
Conceptual Framework 28
Stakeholder Knowledge, Motivation, and Organizational Factors 28
Assumed Knowledge Influences 29
Declarative Factual Knowledge 29
PBIS Committee Members Need to Know the
Established Goal to Evaluate Compliance
with PBIS Interventions 29
PBIS Committee Members Need to Know the
Components of PBIS 30
PBIS Committee Members Need to Know what
Comprises an Effective Evaluation System
that Can Monitor and Ensure Compliance
of PBIS Interventions 30
Declarative Conceptual Knowledge 31
PBIS Committee Members Need to Know the
Various Categories of PBIS to Measure
Improvements and/or Areas of Growth 31
PBIS Committee Members Need to Know the
Three-Tiered PBIS Model 32
PBIS Committee Members Need to Know the
System of PBIS 32
Procedural Knowledge 33
PBIS Committee Members Need to Know How
to Create an Evaluation System 34
PBIS Committee Members Need to Know How to
Implement an Evaluation System 35
PBIS Committee Members Need to Know How to
Analyze the Results of the PBIS Evaluation
System 35
PBIS Committee Members Need to be Able to
Incorporate PBIS Interventions Into the
Schools 36
Metacognitive Knowledge 36
PBIS Committee Members Need to Know How
to Reflect on Their Own Abilities to Create
Their Own Comprehensive Evaluation System 36
PBIS Committee Members Need to Reflect on and
Evaluate Their Own Performance 37
Assumed Motivation Influences 39
Value 40
PBIS Committee Members Need to Value Designing and
Implementing a Comprehensive Evaluation System
to Monitor and Ensure Compliance of PBIS
Interventions 41
Self-Efficacy 41
PBIS Committee Members Need to Have Confidence
that They Can Create an Evaluation System that
will Monitor and Ensure Compliance of PBIS
Interventions 42
Mood 42
PBIS Committee Members Need to Feel Positive about
Designing and Implementing a Comprehensive
Evaluation System 43
Assumed Organization Influences 44
Resources 44
PBIS Committee Members Need Time to Create
an Effective Comprehensive Evaluation
System 44
PBIS Committee Members Need Individuals to
Commit to Design, Implement, and Monitor
an Effective Comprehensive Evaluation
System 45
Policies, Processes, and Procedures 45
PBIS Committee Members Need to Have Policies
that Align with the Goal of the School
District 45
Cultural Setting 46
Summary 48
Chapter Three: Methodology 49
Methodological Framework 49
Assessment of Performance Influences 50
Figure 1. Gap Analysis Process 51
Knowledge Assessment 51
Declarative Factual Knowledge Assessment 51
Declarative Conceptual Knowledge Assessment 58
Procedural Knowledge Assessment 58
Metacognitive Knowledge Assessment 59
Motivation Assessment 59
Value Assessment 60
Self-Efficacy Assessment 63
Mood Assessment 64
Organization, Culture, Context Assessment 64
Assessing Resources 65
Assessing Policies, Processes, and Procedures 68
Assessing Cultural Setting 69
Participating Stakeholders and Sample Selection 70
Surveys 70
Interviews 71
Observation Checklist and Document Analysis
Design 71
Data Collection 72
Surveys 72
Interviews 72
Observations 72
Document Analysis 73
Data Analysis 73
Trustworthiness of Data 73
Role of Investigator 74
Limitations 74
Chapter Four: Results and Findings 75
Participating Stakeholders 75
Data Validation 76
Criteria for Validation of the Data 76
Survey 77
Knowledge Category 77
Motivation Category 77
Organizational Category 77
Interviews 78
Document Analysis 78
Results and Findings for Knowledge Causes 79
Factual Knowledge 79
Assumed Knowledge Influence #1: PBIS
Committee Members Need to Know the
Established Goal to Evaluate Compliance
with PBIS Interventions 79
Assumed Knowledge Influence #2: PBIS
Committee Members Need to Know the
Components of PBIS 81
Assumed Knowledge Influence #3: PBIS
Committee Members Need to Know what
Comprises an Effective Evaluation System
that Can Monitor and Ensure Compliance
of PBIS Interventions 82
Conceptual Knowledge 84
Assumed Conceptual Influence #1: PBIS
Committee Members Need to Know the
Various Categories of PBIS to Measure
Improvements and/or Areas of Growth 84
Assumed Conceptual Influence #2: PBIS
Committee Members Need to Know the
Three-Tiered PBIS Model 86
Assumed Conceptual Influence #3: PBIS
Committee Members Need to Know the
System of PBIS 88
Procedural Knowledge 90
Assumed Procedural Influence #1: PBIS
Committee Members Need to Know How to
Implement an Evaluation System 90
Assumed Procedural Influence #2: PBIS
Committee Members Need to Know How to
Analyze the Results of the PBIS Evaluation
System 92
Assumed Procedural Influence #3: PBIS
Committee Members Need to be Able to
Incorporate PBIS Interventions Into the
Schools 94
Metacognitive Knowledge 95
Assumed Metacognitive Influence #1: PBIS
Committee Members Need to Know How
to Reflect on Their Own Abilities to Create
Their Own Comprehensive Evaluation
System 95
Assumed Metacognitive Influence #2: PBIS
Committee Members Need to Reflect on
and Evaluate Their Own Performance 97
Results and Findings for Motivation Causes 99
Value 99
Assumed Motivation Influence #1: PBIS
Committee Members Need to Value
Designing and Implementing a
Comprehensive Evaluation System to
Monitor and Ensure Compliance of PBIS
Interventions 99
Self-Efficacy 103
Assumed Motivation Influence #2: PBIS
Committee Members Need to Have
Confidence that They Can Create an
Evaluation System that will Monitor and
Ensure Compliance of PBIS Interventions 103
Mood 109
Assumed Motivation Influence #3: PBIS
Committee Members Need to Feel Positive
about Designing and Implementing a
Comprehensive Evaluation System 109
Results and Findings for Organization Causes 114
Organization/Culture/Context 114
Resources. Assumed Organization Influence #1:
PBIS Committee Members Need Time to
Create an Effective Comprehensive
Evaluation System 114
Assumed Organization Influence #2: PBIS
Committee Members Need Individuals to
Commit to Design, Implement, and Monitor
an Effective Comprehensive Evaluation
System 116
Policies, Processes, and Procedures 118
Assumed Organization Influence #3: PBIS
Committee Members Need to Have Policies
that Align with the Goal of the School
District 118
Cultural Settings 120
Assumed Organization Influence #4: PBIS
Committee Members Need to be Part of a
Culture that Supports PBIS Interventions
that Align with the Mission of Investing in
Optimal Learning Environments
to Enhance Safety and Create Positive
School Climates 120
Assumed organization influence #5: PBIS
Committee Members Need to be Part of a
Culture that Values Monitoring and
Accountability, Ensuring that PBIS
Interventions are Being Implemented 122
Chapter Five: Recommendations and Evaluation 128
Purpose of the Study and Questions 128
Recommendations to Address Knowledge, Motivation, and
Organization Influences 128
Knowledge Recommendations 129
Introduction 129
Factual Knowledge Solutions 132
Conceptual Knowledge Solutions 133
Procedural Knowledge Solutions 133
Metacognitive Knowledge Solutions 134
Motivation Recommendations 135
Introduction 135
Expectancy Value 136
Self-Efficacy 137
Organization Recommendations 137
Introduction 137
Cultural Settings 139
Cultural Models 140
Integrated Implementation and Evaluation Plan 141
Implementation and Evaluation Framework 141
Organizational Purpose, Need, and Expectations 142
Level 4: Results and Leading Indicators 143
Level 3: Behavior 145
Critical Behaviors 145
Required Drivers 145
Organizational Support 147
Level 2: Learning 148
Learning Goals 148
Program 148
Evaluation of the Components of Learning 150
Level 1: Reaction 151
Evaluation Tools 152
Immediately Following the Program
Implementation 152
Delayed for a Period after the Program
Implementation 152
Data Analysis and Reporting 153
Limitations 153
Future Research 154
Conclusion 154
References 157
Appendix A. Training Evaluation Form 162
Appendix B. Survey Items 164
Appendix C. Interview Items 173
Appendix D. Informed Consent/Information Sheet 176
Appendix E. Recruitment Letter 178
List of Tables
Table 1. Stakeholders and stakeholders’ performance goals 22
Table 2. Summary of assumed knowledge influences on
stakeholders’ ability to achieve the performance goal 37
Table 3. Summary of assumed motivation influences on
stakeholders’ ability to achieve the performance goal 43
Table 4. Summary of assumed organization influences on
stakeholders’ ability to achieve the performance goal 47
Table 5. Summary of knowledge influences and method of
assessment 52
Table 6. Summary of motivation influences and method of
assessment 61
Table 7. Summary of organization influences and method of
assessment 66
Table 8. Survey results for committee members’ factual
knowledge of the established goal 79
Table 9. Survey results for factual knowledge of the four
components of PBIS 81
Table 10. Survey results for understanding an effective
evaluation system 83
Table 11. Survey results to learn and understand the four
categories to measure areas of growth 85
Table 12. Survey results to learn and understand the four
categories to measure areas of growth 87
Table 13. Survey results for conceptual knowledge of PBIS
committee members 88
Table 14. Survey results for procedural knowledge of PBIS
committee members 90
Table 15. Survey results for PBIS committee members to
analyze intervention results 92
Table 16. Survey results for procedural knowledge of PBIS
committee members 94
Table 17. Survey results for metacognitive knowledge of PBIS
committee members to self-evaluate by reflecting on
their own abilities 96
Table 18. Survey results for PBIS committee members to reflect
on and evaluate their own performance 97
Table 19. Survey results for motivation of PBIS committee
members 99
Table 20. Survey results for motivation of PBIS committee
members 100
Table 21. Survey results for motivation of PBIS committee
members 101
Table 22. Survey results for motivation of PBIS committee
members 102
Table 23. Survey results for motivation of PBIS committee
members 104
Table 24. Survey results for planning and meeting deadlines 105
Table 25. Survey results for designing a comprehensive
evaluation plan 106
Table 26. Survey results for monitoring an evaluation plan
for PBIS 107
Table 27. Survey results for assessing the effectiveness of
PBIS interventions 108
Table 28. Survey results for designing a comprehensive
evaluation system 110
Table 29. Survey results for implementing a comprehensive
evaluation system 111
Table 30. Survey results for monitoring a comprehensive
evaluation system 112
Table 31. Survey results for determining effectiveness of
PBIS interventions 113
Table 32. Survey results for organization of PBIS committee
members 115
Table 33. Survey results for organization of PBIS committee
members 117
Table 34. Survey results for organization of PBIS committee
members 119
Table 35. Survey results for organization of PBIS committee
members 121
Table 36. Survey results to instill a culture that values and
recognizes PBIS interventions 123
Table 37. Survey results for organization of PBIS committee
members 124
Table 38. Survey results for organization of PBIS committee
members 125
Table 39. Survey results for organization of PBIS committee
members 126
Table 40. Summary of knowledge influences and
recommendations 129
Table 41. Summary of motivation influences and
recommendations 136
Table 42. Summary of organization influences and
recommendations 138
Table 43. Outcomes, metrics, and methods for external
and internal outcomes 144
Table 44. Critical behaviors, metrics, methods, and timing
for evaluation 145
Table 45. Required drivers to support critical behaviors 146
Table 46. Evaluation of the components of learning for the
program 150
Table 47. Components to measure reactions to the program 152
Abstract
Using the Gap Analysis problem-solving framework (Clark & Estes, 2008), this dissertation
examined the design and implementation of an evaluation system to monitor the effectiveness of
all Positive Behavioral Interventions and Supports (PBIS). The primary purpose of this study
was to conduct a needs assessment in the areas of knowledge and skill, motivation, and
organizational challenges that contribute to the Social Emotional Elementary School District’s
(SEESD) gap in accomplishing its organizational goal to develop a comprehensive evaluation
system that will monitor and ensure 100% compliance of PBIS interventions. A mixed-method
approach was used to collect quantitative data from 77 surveys and qualitative data from 24
interviews. These data were used to validate and prioritize assumed causes in knowledge and
skills, motivation, and organizational factors. These findings revealed root causes related to
creating a comprehensive evaluation system, implementing an evaluation system, and analyzing
the data produced by the evaluation system. Proposed solutions were aimed at strengthening
organizational supports, fostering commitment and value, and creating a strong knowledge base
for capacity building. This dissertation demonstrates how stakeholders can systematically apply
the Gap Analysis framework to address performance issues when implementing district-wide
initiatives that will monitor and ensure compliance of PBIS interventions.
CHAPTER ONE: INTRODUCTION
Introduction of the Problem of Practice
Exposure to school violence is a widespread, harmful, and costly public health concern
affecting public schools in America. Schools across the nation have experienced a series of
tragedies that have changed the landscape of school security and sense of safety (Sprague,
Nishioka, & Smith, 2007). According to Fitzgerald and Cohen (2012), 65 percent of American
children have experienced at least one violent event before adulthood. Such traumatic events,
whether at school or in the community, can negatively affect the biological, emotional, social,
and cognitive functioning of children (Dods, 2013). Additionally, Brinamen and Page (2012)
suggested that violence affects school readiness and ability to form positive relationships, thus
impacting student learning. Children exposed to violence are at greater risk for behavioral
problems and diminished educational success, and research has revealed that it is critical for school
districts to address student safety by creating positive school climates that promote student learning and
ensure students’ physical and mental health (Brinamen & Page, 2012; Fitzgerald & Cohen,
2012). Hoyle, Marshall, and Yell (2011) stated it is important for school districts to be held
accountable for the implementation of preventative measures by evaluating programs that are
meant to create a positive school climate and improve school achievement. This study examined
what would be required for a school district to create and implement such an evaluation plan.
Organizational Context and Mission
The Social Emotional Elementary School District (a pseudonym; SEESD) is a preschool
through 8th grade school district located in an urban community in southern California. The
mission of SEESD is to raise the bar and close learning gaps for all students. SEESD also
operates under four focus areas: (a) SEESD will ensure all students learn through access to high
quality actions and services that increase academic achievement and civic/career/college
readiness; (b) SEESD will provide high quality actions and services to eliminate barriers to
student access to required and desired areas of study; (c) SEESD will ensure all school sites have
safe, welcoming, healthy and inspiring climates for all students and families, so that all students
are behaviorally and academically engaged in school and ready to learn; and (d) SEESD will
invest in optimal learning environments that enhance student learning and ensure safety (SEESD
Parent Handbook, 2016). The third focus area directly aligns with SEESD’s district-wide
mission to raise the bar and close learning gaps for all students by creating safe and positive
school environments.
There are approximately 33,000 residents living in the SEESD area. Although the school
district has affluent city neighbors, SEESD is categorized as an urban, low-income community
characterized by high rates of poverty, single parent/female-headed households, and high
unemployment. The school district is a student-centered, diverse, elementary school district
serving over 6,000 students, including 2,115 English Language Learners (ELL). SEESD
comprises six elementary schools and two middle schools. The student demographics consist
of approximately 75% Hispanic, 10% African-American, 5% Asian, 5% Caucasian, and 5%
other/unspecified race (CDE, 2017). Presently, 82% of the students participate in the Free and
Reduced Lunch Program, which is a federal measure of poverty (CDE, 2017). In addition,
SEESD serves approximately 500 students who are identified as homeless under the McKinney-
Vento Homeless Education Act.
Over the last four years, SEESD has changed the infrastructure of its schools by hiring
eight school social workers and partnering with more than 10 community mental health agencies
and universities to address the social and emotional needs of its students. SEESD's most
supportive and influential partner has been the University of Social Emotional Support (USES).
Through this collaboration, SEESD has become more than just a field placement provider for
USES; it has become a Teaching Institute supervising more than 50 master’s-level USES
interns. SEESD values its interns and has invested in providing them with an enriching
learning experience in order to serve the families and children of the SEESD community. With
new additions to the SEESD team, the Student Support Services department aims to reinforce
preventative interventions that reduce disciplinary incidents and increase schools’ sense of safety
and support, in turn enhancing academic outcomes. SEESD would particularly like to measure the
effectiveness of the Positive Behavioral Interventions and Supports (PBIS, 2017b) framework.
PBIS is an intervention intended to improve school climate by using system-wide positive
behavioral interventions to discourage disruptive behaviors, measured through reductions in school-
and district-wide behavioral incidents and increases in positive school climate across the district
(Christofferson & Callahan, 2015).
Organizational Performance Need
In order to fulfill its mission to raise the academic achievement bar and close learning
gaps for all students, SEESD’s leadership considered it important for the district to continue to
implement Positive Behavioral Interventions and Supports (PBIS, 2017b) and create an
evaluation system that would measure the effectiveness of the PBIS (2017b) framework. If the
Student Support Services department does not create an evaluation tool that measures its
effectiveness, it will not be able to hold teachers, school site social workers, or district
administrators accountable for the outputs or outcomes of PBIS interventions. As a result, the
SEESD will not be able to reinforce the goals reflected in its mission statement, thus
supporting a culture of decision-making based on anecdotes and opinions rather than
providing concrete evidence. In essence, the SEESD needs an evaluation system for compliance.
Currently, no such system exists.
Related Literature
Positive Behavioral Interventions and Supports (PBIS, 2017a) is a multi-tiered prevention
framework that aims to promote a positive school climate and prevent disruptive behavior
problems by applying three prevention tiers to organize effective social skills instruction and
behavioral interventions (Freeman, Miller, & Newcomer, 2015; McIntosh, Mercer, Nese,
Strickland-Cohen, & Hoselton, 2016). The first prevention tier is the primary support which is
designed to prevent initial occurrences of problem behavior. This universal level of support
benefits approximately 80% of all students in a school (Hoyle et al., 2011). The second
prevention tier provides more intensive interventions to support students who do not respond to
primary prevention strategies. This level of support benefits approximately 15% of students in a
school (Hoyle et al., 2011). The third prevention tier provides intensive individual interventions
to support students who exhibit serious behavior problems. This level of support benefits
approximately 5% of students in a school (Hoyle et al., 2011). Essentially, the goal of PBIS is to
create a positive school environment so that discipline problems decrease and student academic
skills improve (Cohen, Kincaid, & Childs, 2007). Christofferson and Callahan (2015) advocated
for school safety and explained that creating an environment where students experience a sense of
belonging, including feeling protected and accepted, fosters school connectedness. When
students feel connected to their environment, their success at school and beyond is more likely.
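The tier proportions described above can be sketched as a small, hypothetical computation. The code below is purely illustrative (the dictionary, labels, and function name are invented for this sketch); only the approximate percentages are drawn from Hoyle et al. (2011):

```python
# Illustrative sketch (not from the study) of the three-tiered PBIS
# prevention model: each successive tier serves a smaller share of
# students with more intensive support (shares per Hoyle et al., 2011).

PBIS_TIERS = {
    # tier: (description, approximate share of students served)
    1: ("universal supports to prevent initial problem behavior", 0.80),
    2: ("targeted interventions for students not responding to Tier 1", 0.15),
    3: ("intensive individual interventions for serious behavior problems", 0.05),
}

def students_served(enrollment):
    """Estimate how many students each tier would serve in a school."""
    return {tier: round(enrollment * share)
            for tier, (_desc, share) in PBIS_TIERS.items()}
```

For a district the size of SEESD, serving over 6,000 students, this sketch would place roughly 4,800 students at Tier 1, 900 at Tier 2, and 300 at Tier 3.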
School is the ideal environment to implement universal interventions aimed to promote
protective factors that foster resilience and positive emotional development. Existing literature
supported that the implementation of PBIS (2017a) positively impacts academic achievement,
student-teacher interactions, and classroom instructional time. PBIS also decreases office
discipline referrals, suspensions, and expulsions; and improves school climate and students’
social and emotional competencies (McIntosh et al., 2016). Overall, PBIS has had a tremendous
impact on educational outcomes for all children and youth, particularly students receiving special
education services and/or experiencing mental health concerns. Schools across the nation have been
implementing school- and district-wide systems of support encouraged by the United States
Congress due to the historic exclusion of students with disabilities based on unaddressed
behavior and the evidence base that supports the use of PBIS (2017a). Evidently, the emphasis
on using functional assessment and positive approaches to encourage good behavior in schools
has made such an impact across the country that it is specifically mentioned in the Individuals
with Disabilities Education Act (IDEA), as amended in 2004 (PBIS, 2017a).
The implementation structure and demonstrated impact of PBIS (2017a) provide a
strong foundation on which to increase the capacity of its efforts to assist students with
higher-level needs. The prevention framework has had such a strong influence on education that it
has been implemented nationwide in over 19,000 schools (Cohen et al., 2007; PBIS, 2017a,
2017b). As implementation of PBIS took place, George and Childs (2012) recommended school
districts evaluate the effects PBIS approaches have had on student performance. In addition, it
was also recommended for school districts to develop and document comprehensive evaluations
to determine progress and evidence of improved student outcomes, particularly behavioral
outcomes (Cohen et al., 2007; George & Childs, 2012). A comprehensive evaluation plan
essentially influences and determines future support for PBIS and addresses replication,
sustainability, and continuous improvement (George & Childs, 2012).
Recent developments regarding PBIS (2017b) include measurement tools to assess fidelity
and monitor implementation, prompted by the absence of expedient, effective assessment
tools (Cohen et al., 2007; George & Childs, 2012). Essentially, there is a need for accountability
when implementing PBIS interventions; however, many school districts are not evaluating
effectiveness, even though such evaluation helps maintain a safe school environment (Hoyle et
al., 2011). This has serious implications for school environments, since violence has been rated
by the public as the top problem of concern in schools (Hoyle et al., 2011).
Importance of the Organizational Innovation
It is important for the SEESD to implement an evaluation system to monitor the
effectiveness of PBIS (2017b) and maintain compliance in addressing the school district’s mission
and focus areas. If the SEESD is not compliant, the sustainability of PBIS may be at risk. In
other words, if PBIS is not supported with concrete data, the school district’s decision-making
when allocating resources will be affected, and it may not be possible to improve or expand its
approaches. To avoid these risks, the SEESD needs to design and implement an evaluation
system to monitor the effectiveness of PBIS and maintain compliance in order to successfully
achieve its mission and focus areas.
Organizational Performance Goal
As such, SEESD established the goal that by June 2018, SEESD will design and
implement an evaluation system to monitor the effectiveness of all PBIS (2017b) interventions to
ensure 100% compliance in addressing its mission to raise the bar and close learning gaps for all
students, and to ensure a positive school climate across the district. The Student Support
Services Director at SEESD established this goal after an initial meeting with district-level
administrators and School Site Social Workers highlighting the lack of accountability from the
department. The achievement of the Student Support Services department will be measured by
the results of school-wide office referrals compared to the previous academic school year.
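The year-over-year comparison described above can be expressed as a simple calculation. The function and figures below are hypothetical, invented for illustration, and are not drawn from SEESD data:

```python
# Hypothetical sketch of the department's success metric described above:
# comparing school-wide office discipline referrals against the prior
# academic year. A negative percent change indicates a reduction in referrals.

def referral_change(previous_year, current_year):
    """Percent change in office referrals relative to the previous year."""
    if previous_year <= 0:
        raise ValueError("previous-year referral count must be positive")
    return (current_year - previous_year) / previous_year * 100
```

For example, a drop from 250 office referrals in the previous year to 200 in the current year would register as a 20% reduction.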
Table 1
Stakeholders and Stakeholders’ Performance Goals
Organizational Mission
The mission of the Social Emotional Elementary School District (SEESD) is to raise the bar
and close learning gaps for all students. SEESD invests in an optimal learning environment
that enhances safety and creates positive school climates so that all students are behaviorally
and academically engaged in school.
Organizational Performance Goal
By June 2018, the SEESD will design and implement a comprehensive evaluation system that
will monitor and ensure 100% compliance of PBIS interventions in addressing its mission.
District-level
Administrators
By June 2018, the SEESD
administrators allocated and
provided resources for the
PBIS committee to conduct
an assessment of PBIS.
PBIS Committee
By August 2018, the PBIS
committee comprised of
school administrators, school
site social workers, and
teacher representatives
evaluated the implementation
of PBIS at each school site
and made intervention
recommendations for each.
Classroom Teachers
By September 2018, 100% of
the SEESD staff had
knowledge of PBIS
intervention gaps, which would
improve attitudes and
behaviors for performance in
the 2017-2018 academic
school year.
Critical Behaviors
Kirkpatrick and Kirkpatrick (2016) suggested that organizations identify a few critical
behaviors the key stakeholder group must perform for the organization to achieve its goal. For
this study, the key stakeholder group, the PBIS committee, would need to perform the following
critical behaviors to achieve its performance goal:
a) create a comprehensive evaluation system;
b) implement the evaluation system; and
c) analyze the data produced by the evaluation system.
Stakeholder for the Study and Stakeholder Performance Gap
While the joint efforts of all stakeholders will contribute to the achievement of the overall
organizational goal of designing and implementing an evaluation system that will monitor and
ensure 100% compliance of PBIS (2017b) interventions, it is important to understand the needs
of the PBIS committee members as they attempt to design and implement the evaluation plans
that align with the SEESD’s mission. Therefore, the stakeholders of focus for this study will be
the SEESD PBIS committee members. The stakeholders’ goal, supported by the Director of
Student Support Services and district-level administrators, was that 100% of the PBIS committee
will assess and design a comprehensive evaluation system that will measure the effectiveness of
PBIS at the SEESD. The evaluation system will measure replication, sustainability, and
improvements. The SEESD’s goal is to ensure 100% compliance of PBIS interventions aimed to
address its mission. The gap in performance, therefore, is 100%.
Purpose of the Project and Questions
The purpose of this project was to conduct a needs assessment in the areas of knowledge
and skill, motivation, and organizational resources necessary to reach the organizational
performance goal. The assessment began by generating a list of possible needs and then moved
to examining these systematically to focus on actual or validated needs. While a complete needs
assessment would focus on all stakeholders, for practical purposes the stakeholder to be focused
on in this assessment was the SEESD’s PBIS committee members.
As such, the questions that guided this study were the following:
1. What are the knowledge, motivation, and organizational needs necessary for the
SEESD’s PBIS committee members to achieve their goal of designing and implementing
a comprehensive evaluation system that will monitor and ensure 100% compliance of
PBIS interventions addressing its mission?
2. What are the recommended knowledge, motivation, and organizational solutions to those
needs?
Methodological Framework
Clark and Estes' (2008) gap analysis, a systematic, analytical method that helps clarify
organizational performance goals and identify the gap between the actual performance level and
the preferred performance level within an organization, was adapted for needs analysis.
Assumed knowledge, motivation, and organizational needs were generated based on identified
critical behaviors and related literature. These needs were validated by using surveys, focus
groups and interviews, literature review, and content analysis. Research-based solutions were
recommended and evaluated in a comprehensive manner.
Definitions
● Fidelity of Implementation: is the determination of how well an intervention is
implemented in comparison with the original program design during an efficacy and/or
effectiveness study (O'Donnell, 2008).
● Local Control Accountability Plan (LCAP): is a three-year funding plan focused on state
and local priorities that describes goals, actions, services, and expenditures to support
student achievement (Los Angeles County Office of Education, 2017b).
● Office Discipline Referrals (ODR): a measure or metric of student behavior and the
incidence of violence and aggression in schools useful for guiding refinements in
program interventions and making inferences about intervention effects (Sprague et al.,
2007).
● Positive Behavioral Interventions and Supports (PBIS): is a proactive system-wide
framework for creating and maintaining safe and effective learning environments in
schools, and ensuring that all students have the social skills needed to ensure their
success at school and beyond (Los Angeles County Office of Education, 2017a).
● Response to Intervention (RtI): is a tiered approach addressing all students within a
school by providing the appropriate intensity of academic support necessary for
educational progress (Freeman et al., 2015).
● School Climate: the shared beliefs, values, and attitudes that shape the interactions
between students, teachers, and administrators and set the parameters of acceptable
behavior and norms of the school (Christofferson & Callahan, 2015).
● School Safety: a basic need of every human being, which includes but is not limited
to emotional and physical safety. Feeling secure while at school greatly promotes the
performance and learning of students. This ensures good relationships and promotes
students’ physical and mental health (Christofferson & Callahan, 2015).
● Tier 2 Interventions: Tier 2 behavioral interventions provide students with frequent and
structured opportunities to practice school-wide expectations with specific feedback.
Three types of Tier 2 interventions are included at this level: (a) targeted interventions,
(b) simple group interventions, and (c) individualized interventions for minor problem
behavior. Each can be adapted to address the functional consequences of both
externalizing and internalizing problem behavior (Newcomer, Freeman, & Barrett, 2013).
Organization of the Proposal
Three chapters were used to organize this proposal. This chapter provided the reader with
key concepts and terminology commonly found in a discussion about designing and
implementing a comprehensive evaluation system to monitor and ensure compliance of PBIS
(2017b) interventions. The organization's mission, goals, and stakeholders, as well as the initial
concepts of gap analysis adapted to needs analysis, were introduced. A review of current
literature surrounding the scope of the study is included in Chapter Two. Topics of school
safety, school climate, systems of support and interventions, policy, and funding will be
addressed. Chapter Three provides details of the assumed needs for this study as well as the
methodology for the choice of participants, data collection, and analysis.
CHAPTER TWO: REVIEW OF THE LITERATURE
Chapter Two begins with an overview of Positive Behavioral Interventions and Supports
(PBIS, 2017b) and includes an outline of the role of the PBIS committee when designing and
implementing a comprehensive evaluation system that will monitor and ensure compliance of
such interventions. Then, each critical behavior described in Chapter One for the stakeholder
group of focus, the PBIS committee, which is designing and implementing a comprehensive
evaluation system, will be examined from the knowledge, motivation, and organization (KMO)
perspective.
Positive Behavioral Interventions and Supports (PBIS, 2017a) is an approach that
promotes effective, efficient, and equitable learning environments that enhance academic and
social behavior outcomes for all students. School districts implementing PBIS (2017b) have
started to develop and document comprehensive evaluation plans to determine progress and
evidence of improved student outcomes, particularly behavioral outcomes (George & Childs,
2012). Evaluation has expanded to address questions related to replication, sustainability, and
continuous improvement. McIntosh et al. (2016) stated that a school district’s support is needed
to sustain implementation that extends beyond an individual school; therefore, it is recommended
for districts to create a leadership team or committee that could design and implement a
comprehensive evaluation system. Essentially, the leadership team or committee could monitor
and ensure compliance of PBIS interventions at all schools.
Once there is an established and committed leadership team or committee, Newcomer et
al. (2013) suggested an intervention coordinator should monitor the progress data of each school
to guide decisions to continue, intensify, or decrease interventions. The intervention coordinator
and leadership team or committee could review the overall effectiveness of the interventions
provided and revise the intervention practices to create a comprehensive evaluation plan
(Newcomer et al., 2013). A comprehensive evaluation plan could help identify and examine
potential systemic issues that may confirm lack of readiness and commitment from participating
schools or identify areas for increased assistance (Freeman et al., 2015; George & Childs, 2012;
Newcomer et al., 2013). Ultimately, collecting data across a school district supports behavior
and discipline efforts, and helps analyze student performance.
Conceptual Framework
Clark and Estes (2008) recommended that an organization identify causes of performance
gaps using three critical factors: (a) knowledge and skills, (b) motivation to achieve the goal, and
(c) organizational barriers. The purpose of this gap analysis is to identify whether all employees
have adequate knowledge, motivation, and organizational support to achieve desired goals. This
study used Clark and Estes’ (2008) gap analysis to determine the needs of the performance gap.
In this case, the performance gap is 100%, and the goal of the gap analysis is to identify what the
organization needs to put in place to successfully create a comprehensive evaluation plan.
Stakeholder Knowledge, Motivation, and Organizational Factors
Anderson and Krathwohl (2001) organized knowledge into four distinct subcategories.
These subcategories help create learning objectives that target specific types of knowledge:
factual knowledge, conceptual knowledge, procedural knowledge, and metacognitive knowledge.
To determine and clarify assumed knowledge causes, an analysis of each knowledge type will be
studied in order to create a comprehensive evaluation plan.
Assumed Knowledge Influences
Declarative Factual Knowledge
To perform their critical behaviors, the PBIS committee needs to know the established
goal to evaluate the PBIS (2017b) interventions, the components of PBIS, and the factors that
comprise an effective evaluation system that can monitor and ensure compliance of PBIS
interventions.
Anderson and Krathwohl (2001) defined declarative factual knowledge as the basic
element individuals must know to be acquainted with a particular discipline in order to problem
solve identified gaps in education. Declarative factual knowledge is fundamental because it
forms the foundation for higher levels of knowledge. Knowledge of major ideas and mastery of
a particular subject matter are key elements of factual knowledge (Anderson & Krathwohl,
2001). Given the definition of declarative factual knowledge, each factual knowledge need is
examined next.
PBIS committee members need to know the established goal to evaluate compliance
with PBIS interventions. Without an established goal, stated purpose, clear expectations
backed up by specific rules, and procedures for evaluation, compliance with PBIS interventions
is compromised (Lewis & Sugai, 1999). Cohen et al. (2007) indicated that one reason for the
lack of implementation measures of PBIS (2017b) interventions may be the scarcity of
implementation assessment instruments. For instance, one validated tool to evaluate the
implementation of PBIS is the School-Wide Evaluation Tool, a 28-item research-based
observation and interview instrument. This implementation assessment instrument requires
clearly defined expectations so that students and staff could recall or recognize information that
was learned. Once there are established goals, feedback about the implementation process could
be provided to help schools and districts improve their programs (Cohen et al., 2007).
PBIS committee members need to know the components of PBIS. As described by
Freeman et al. (2015), district leadership teams need to work together to create a comprehensive
plan that lists key components of PBIS (2017b). Gaining knowledge of these components helps
participating schools and districts align district policies with PBIS, coordinate training, and
evaluate the effectiveness of implementation efforts (Freeman et al., 2015). Research by
Christofferson and Callahan (2015) indicated key components of PBIS include: (a) active
teaching and reinforcement of a small number of clearly defined social-behavioral expectations,
(b) consistent implementation of consequences for violations of school expectations, and (c) use
of data to drive intervention planning. Once PBIS committee members know the components of
PBIS, they will be able to recognize and recall information learned in order to create or enhance
an effective evaluation system.
PBIS committee members need to know what comprises an effective evaluation
system that can monitor and ensure compliance of PBIS interventions. In order to monitor
and ensure PBIS interventions are implemented district-wide, PBIS committee members must
learn how to evaluate practices that have had an effect on student performance and school
climate within an established evaluation plan (George & Childs, 2012). George and Childs
(2012) recommended establishing a strong foundation in order to effectively evaluate a plan and
assess implementation fidelity across schools. Once PBIS committee members develop a strong
foundation and decide on an assessment method, improvements could be made that could lead to
activities the PBIS committee could conduct (e.g., additional training, staff changes, increased
technical assistance, improved communication from the Superintendent) to monitor and ensure
compliance (George & Childs, 2012, p. 203; Horner et al., 2013). Additionally, the PBIS
committee members could monitor progress data to guide decisions of whether to continue,
intensify, or fade interventions. Moreover, PBIS committee members could review the overall
effectiveness of PBIS interventions as a whole to track the percent of students participating and
demonstrating improvements (Newcomer et al., 2013). In essence, data can result in revisions to
intervention practices (e.g., procedures, intensity, and/or monitoring implementation integrity)
that could benefit the program’s success.
Declarative Conceptual Knowledge
To perform their critical behaviors, the PBIS committee needs to know various categories
of PBIS (2017b) to measure improvements and/or areas of growth, the three-tiered PBIS model,
and the system of PBIS. Anderson and Krathwohl (2001) defined declarative conceptual
knowledge as the interrelationship among the basic elements within a larger structure that
enables them to function together. Declarative conceptual knowledge is fundamental because it
provides knowledge of classifications and categories, essential principles, and structures in order
to construct meaning from various forms of communication. Essentially, this dimension
combines the ability to retrieve relevant knowledge in order to summarize and construct meaning
(Anderson & Krathwohl, 2001). Given the definition of declarative conceptual knowledge, the
PBIS committee will be examined for further application of the dimension that enables transfer
of knowledge across domains.
PBIS committee members need to know the various categories of PBIS to measure
improvements and/or areas of growth. Even though PBIS (2017a) is unified in its approach to
support students both academically and behaviorally, PBIS offers various categories that could
be measured (Coffey & Horner, 2012). These categories include student grade-level behavior
expectations, responses to behavior violations, evidence-based interventions, management,
support, and evaluation (Cohen et al., 2007). In addition, fidelity of implementation is an
important category to consider. Without a measure of fidelity of implementation in an
effectiveness study, "researchers may not be able to account for negative or ambiguous findings,
nor determine whether unsuccessful outcomes are due to an ineffective program or due to failure
to implement the program and its conceptual and methodological underpinnings as intended"
(O'Donnell, 2008, p. 42). In essence, identifying categories to measure a program leads to further
improvements for identified areas of growth.
PBIS committee members need to know the three-tiered PBIS model. In order for
PBIS (2017b) to successfully be implemented district-wide, committee members need to
understand the core features of each tier. School-wide PBIS combines primary, secondary, and
tertiary prevention interventions. As described by Hoyle et al. (2011), Tier 1 is the primary
intervention where students receive the same information regarding behavioral expectations,
positive and negative examples, opportunities to practice, and data collection to make informed
decisions to create action plans. Tier 2 is the secondary intervention targeted at students at
heightened risk for behavior problems, creating specialized group systems to support identified
students. Lastly, Tier 3 is the tertiary intervention that provides intense individualized
systems for students with high risk behavior. All three tiers teach and reinforce appropriate
behaviors that prevent behavior problems using approaches to develop an orderly and efficient
school-wide environment (Hoyle et al., 2011). In sum, the PBIS committee members need to
know what each tier entails in order to develop an appropriate evaluation system for each tier.
PBIS committee members need to know the system of PBIS. Learning how PBIS
(2017b) functions encourages committee members to engage in a systemic support that extends
beyond an individual school (PBIS, 2017b). Comprehending the PBIS system and its key
elements contribute to the creation of a sustainable structure. When developing a comprehensive
and sustainable structure, PBIS includes school-wide procedures and processes intended for all
students and all staff in all settings. Once behavioral expectations are defined and taught,
students can be rewarded and staff members can develop an on-going reward system or a system
for responding to behavioral violations (Cohen et al., 2007). Understanding how systems work
can lead to monitoring and decision-making.
This level of understanding can assist the PBIS committee in managing systemic problems
and planning for the future. Above all, the school district can evaluate the effectiveness of this approach
and measure student performance within an established evaluation plan (George & Childs,
2012). Not only is it important to assess the level of implementation across the critical elements
at a school level, but it is equally important to assess the data across many schools within a
district. George and Childs (2012) stated that when a district has committed resources to address
behavioral support and tackle discipline in hopes to improve student performance, “evaluation
data at the district level are critical” (p. 202). Evaluation allows a committee to examine
potential systemic issues that may or may not have been successful. In essence, a review of
intervention effectiveness could result in revisions to the approach or monitoring implementation
integrity; hence, modifying procedures, adding or changing staff, enhancing training, or
improving communications (Newcomer et al., 2013).
Procedural Knowledge
To perform their critical behaviors, the PBIS committee needs to know how to create an
evaluation system, how to implement and analyze the results of an evaluation system, and how to
incorporate PBIS (2017b) interventions into the schools. Anderson and Krathwohl (2001)
defined procedural knowledge as methods of inquiry and criteria for using skills, techniques,
and methods. Procedural knowledge is important to examine because it provides knowledge of
subject-specific skills to determine when to use appropriate procedures. This dimension not only
combines the ability to retrieve information in order to construct meaning, it also includes
knowledge of the criteria and conditions under which procedures should be followed (Anderson
& Krathwohl, 2001). Given the definition of procedural knowledge, the PBIS committee will be
examined for further application of the dimension that enables them to solve problems within a
particular area or apply subject-specific criteria that will help determine which method,
technique, or procedure to apply when creating an evaluation system (Anderson & Krathwohl,
2001).
PBIS committee members need to know how to create an evaluation system. In
order for PBIS to be successfully implemented district-wide, PBIS committee members need to
develop a strong foundation at each school site that fosters an environment where students feel a
sense of belonging, safety, acceptance, and connection (Christofferson & Callahan, 2015).
Students feel safe and connected to the school environment when there are shared beliefs,
values, and attitudes that shape the interactions between students, teachers, and administrators (Christofferson & Callahan,
2015). Feeling secure while at school promotes healthy relationships and students’ physical and
mental health within a school setting. Essentially, this type of environment sets the parameters
for acceptable behavior and norms which are integral elements that help create an evaluation
system.
In addition to creating a positive school environment, the PBIS committee needs to create
an infrastructure that aligns with state, district, and school policies and goals (Freeman et al.,
2015). Once a strong infrastructure has been produced based on shared policies and goals, the
PBIS committee could engage in crafting formative and summative evaluation questions that
measure various elements such as participation, fidelity, effectiveness, and improvement of
intervention practices (Newcomer et al., 2013). These processes could help create an evaluation
system that is comprehensive and measures overall effectiveness.
PBIS committee members need to know how to implement an evaluation system. As
previously stated, PBIS committee members need to develop a strong foundation where an
infrastructure for all levels of coordination can function. Barrett, Bradshaw, and Lewis-Palmer
(2008) suggested PBIS coordination teams work together to create a comprehensive plan that
confirms readiness and commitment from participating districts and schools. Freeman et al.
(2015) recommended PBIS committee members provide ongoing training to all participants in
order to universalize norms and expectations. Once there is a solid foundation, a commitment
from the district and/or school, and ongoing training occurs, the PBIS committee could evaluate
concepts taught by the team. As a result, future evaluation processes would be set to be more
efficient and effective.
PBIS committee members need to know how to analyze the results of the PBIS
evaluation system. Since there are different types of evaluation tools, Freeman et al. (2015)
suggested PBIS committee members organize, review, share, and monitor data sources using
technology programs. Accessing data from a technology program can help committee members
focus on target outcomes and analyze evaluation results. In addition to organizing and storing
data, it is also recommended for PBIS committee members to train participants on the delivery of
the PBIS (2017b) approach. Ziomek-Daigle, Goodman-Scott, Cavin, and Donohue (2016)
reported that ongoing training helps establish expectations that can be evaluated once
interventions are implemented. After implementing interventions, PBIS committee members
can measure the impact of their interventions on the desired outcomes, thus collectively
determining their effectiveness and impact.
PBIS committee members need to be able to incorporate PBIS interventions into the
schools. Without PBIS (2017b) interventions, the design and implementation of a
comprehensive evaluation system is nonexistent. Newcomer et al. (2013) recommended that
PBIS committee members teach and monitor program effectiveness to determine if modifications
are needed in a particular approach. In essence, reviewing intervention effectiveness could result
in revisions to intervention practices (e.g., procedures, intensity, and/or monitoring
implementation integrity), hence making judgments about the basic value of the implementation
efforts (Newcomer et al., 2013).
Metacognitive Knowledge
To perform their critical behaviors, the PBIS committee needs to know how to reflect on
their own abilities and evaluate their own performance when creating their own comprehensive
evaluation system. Anderson and Krathwohl (2001) defined metacognitive knowledge as
knowledge about cognition in general, as well as knowledge about one’s own cognitive
processes and the cognitive strategies that are required for particular tasks. In addition,
metacognitive awareness allows individuals to appropriately adapt the ways in which they think
and operate. Essentially, this dimension is characterized by interpersonal skills needed to reflect,
evaluate, and problem solve (Anderson & Krathwohl, 2001). Given the definition of
metacognitive knowledge, the PBIS committee will be examined for further application of the
dimension that enables them to collaborate as a team when creating an evaluation system.
PBIS committee members need to know how to reflect on their own abilities to
create their own comprehensive evaluation system. When individuals reflect on their own
abilities, they are aware of practices and approaches that have had positive effects on student
performance (George & Childs, 2012). This level of experience and awareness was reported to
be important when creating a comprehensive evaluation system that addresses implementation
fidelity across a school site and across a school district (George & Childs, 2012). Having the
ability to examine systemic issues upon creating an evaluation system allows a team to avoid
potential pitfalls. It also allows a team to reflect and think of additional types of data to
thoroughly evaluate a comprehensive evaluation plan (Freeman et al., 2015; George & Childs,
2012).
PBIS committee members need to reflect on and evaluate their own performance.
Team reflection and evaluation are important when a district has committed resources to address
behavioral support and discipline to improve student performance (George & Childs, 2012).
Therefore, reporting school functioning at each tier is critical when implementing interventions
across the years (Freeman et al., 2015). In essence, reporting implementation findings requires
evaluation of and reflection on one's own performance, thus holding individuals accountable for
their involvement in creating a comprehensive evaluation plan.
Table 2

Summary of Assumed Knowledge Influences on Stakeholders' Ability to Achieve the
Performance Goal

Declarative Factual
PBIS committee members need to know the established goal to evaluate compliance of PBIS
interventions. (Cohen, Kincaid, & Childs, 2007)
PBIS committee members need to know the components of PBIS. (Christofferson & Callahan,
2015; Freeman, Miller, & Newcomer, 2015)
PBIS committee members need to know what comprises an effective evaluation system that can
monitor and ensure compliance of PBIS interventions. (Coffey & Horner, 2012; Freeman,
Miller, & Newcomer, 2015; George & Childs, 2012; Horner et al., 2013; Newcomer, Freeman,
& Barrett, 2013)

Declarative Conceptual
PBIS committee members need to know the various categories of PBIS to measure
improvements and/or areas of growth. (Coffey & Horner, 2012; Cohen, Kincaid, & Childs,
2007; Freeman, Miller, & Newcomer, 2015; George & Childs, 2012; O'Donnell, 2008)
PBIS committee members need to know the three-tiered PBIS model. (Positive Behavioral
Interventions and Supports, 2017a, 2017b; Newcomer, Freeman, & Barrett, 2013)
PBIS committee members need to know the system of PBIS. (Positive Behavioral Interventions
and Supports, 2017a, 2017b; Cohen, Kincaid, & Childs, 2007; George & Childs, 2012;
Newcomer, Freeman, & Barrett, 2013)

Procedural
PBIS committee members need to know how to create an evaluation system. (Christofferson &
Callahan, 2015; Freeman, Miller, & Newcomer, 2015; Newcomer, Freeman, & Barrett, 2013)
PBIS committee members need to know how to implement an evaluation system. (Barrett,
Bradshaw, & Lewis-Palmer, 2008; Freeman, Miller, & Newcomer, 2015)
PBIS committee members need to know how to analyze the results of the PBIS evaluation
system. (Freeman, Miller, & Newcomer, 2015; Ziomek-Daigle, Goodman-Scott, Cavin, &
Donohue, 2016)
PBIS committee members need to be able to incorporate PBIS interventions into the schools.
(Newcomer, Freeman, & Barrett, 2013)

Metacognitive
PBIS committee members need to know how to reflect on their own abilities to create their own
comprehensive evaluation system. (George & Childs, 2012; Freeman, Miller, & Newcomer,
2015)
PBIS committee members need to reflect on and evaluate their own performance. (George &
Childs, 2012; Freeman, Miller, & Newcomer, 2015)
Assumed Motivation Influences
Motivation is an essential component for performance that complements knowledge.
Motivation determines the extent to which people choose to exert mental effort, thus impacting
employee performance. This section presents a review of literature on constructs related to
motivation in the workplace, particularly Clark and Estes' (2008) three motivational "indexes,"
or types of motivational facets, that play an important role in workplace performance. These
processes include active choice, persistence, and mental effort. Active choice in the workplace
refers to making a decision when faced with more than one possibility and actively pursuing a
particular goal related to work. Persistence refers to consistency and perseverance
in achieving a goal despite distractions in the workplace. Lastly, mental effort refers to the
amount of time and energy one decides to invest in achieving a particular goal. All of these
facets, combined with effective knowledge and skills can result in increased performance, thus
achieving goals in the workplace (Clark & Estes, 2008).
In addition, confidence and emotions play an important role with beliefs about
effectiveness. As stated by Clark and Estes (2008), "our confidence is a measure of our belief in
our own abilities. Some of our strong emotions are the product of our reasoning and experience
about how effective we have been and will continue to be in our work environment" (p. 95).
Therefore, the concept of value is essential because it helps determine the amount of persistence
and commitment an established goal will obtain. Self-efficacy is another concept that plays an
important role in motivating and influencing individuals. An individual's mood and beliefs about
their capability to achieve a goal have a lot to do with the value they place on it, along with their
own perceptions about the task at hand. The three assumed motivation influences that will be
examined below are value, self-efficacy, and mood.
Value. Clark and Estes (2008) presented three types of values that motivate individuals to accomplish an established goal. The first is interest value, which describes an individual's intrinsic interest in mastering a new skill or goal (Clark & Estes, 2008). The second is skill value, which describes individuals showcasing their abilities, suggesting they are good at certain kinds of tasks. The third is utility value, which describes individuals focusing on the benefits that come when a goal is met
(Clark & Estes, 2008). Given this information, the PBIS committee will be examined for further
application of the values they place on their work.
PBIS committee members need to value designing and implementing a
comprehensive evaluation system to monitor and ensure compliance of PBIS interventions.
When creating a PBIS school district, the environment at each school must be defined by shared
beliefs, attitudes, and values. These shared values shape the interactions students, teachers, and
administrators have with one another. Values also set parameters for acceptable behavior and
norms at each level (Christofferson & Callahan, 2015). When designing and implementing a
comprehensive evaluation system to monitor and ensure compliance of PBIS interventions, the
PBIS committee members need to value the importance of assessing an environment where
students experience a sense of belonging (Christofferson & Callahan, 2015). Feeling safe and accepted are indicators that PBIS interventions are successfully implemented. This type of connectedness is essential when creating an evaluation system because it establishes agreements based on people's values and increases work commitment, thus helping to achieve established goals that ensure compliance of PBIS interventions (Clark & Estes, 2008; Freeman et al., 2015).
Self-efficacy. Self-efficacy is defined as one’s perceived judgment about his or her
ability to execute actions necessary to achieve a specific goal or level of performance (Bandura,
1997). It is also a strong determinant and predictor of the level of accomplishment individuals
will attain; therefore, it is important for individuals to engage in tasks they feel confident
executing (Pajares, 1996). When individuals do not feel competent in their abilities, they are
likely not to accomplish a task or goal. For this reason, Bandura (1997) and Pajares (1996)
suggested self-efficacy beliefs should be assessed at an optimal level of specificity that
corresponds to the critical task or goal being examined, and the domain of functioning being
analyzed.
PBIS committee members need to have confidence that they can create an
evaluation system that will monitor and ensure compliance of PBIS interventions. In order
to build the confidence of the PBIS committee members, self-efficacy should be assessed to
identify or predict self-beliefs and performances at varying levels. As Coffey and Horner (2012)
indicated, educational leaders cannot “experiment” on students with practices that have not been
proven effective. Therefore, students need to be provided with the best possible chance to
succeed by receiving supports that have an evidence base. Although the content of evidence-based practices is critical, it is insufficient on its own to ensure success; how the innovation is executed is an underemphasized component of transforming an effective innovation into its intended outcome (Coffey & Horner, 2012). Offering the best practices, coupled with the freedom to innovate, builds certainty and a sense of ownership (Newcomer et al., 2013). As a result, an individual's self-efficacy is sustained by the confidence built throughout the process, thus ensuring compliance with agreed-upon goals.
Mood. Creating an environment where students experience a sense of belonging,
including feeling safe and accepted is integral in maintaining and fostering school
connectedness. School connectedness has been associated with a positive school climate and
overall improved behavior. It has also been defined as feeling connected to peers, teachers, and
staff at school; a sense of enjoyment and liking of school; a belief that school is important; active
engagement in school activities; and a perceived sense of belonging, closeness, and commitment
to school (Christofferson & Callahan, 2015, p. 38). For this reason, feeling secure while at school greatly promotes students' performance and learning. This, in turn, fosters good relationships and promotes students' mental health (Christofferson & Callahan, 2015).
PBIS committee members need to feel positive about designing and implementing a comprehensive evaluation system. Knowing that the interventions designed by the PBIS committee positively impact school climate and school connectedness district-wide generates feelings of satisfaction and value in the work the committee is creating. In addition, the execution of a comprehensive evaluation system allows the committee to continuously assess, modify, and implement additional services or supports that benefit the students and the district at large (Horner et al., 2013). In essence, these outcomes motivate the PBIS committee to collaborate in creating a comprehensive evaluation system.
Table 3
Summary of Assumed Motivation Influences on Stakeholders’ Ability to Achieve the Performance
Goal
Assumed Motivation Influences
Research Literature
Author, Year; Author, Year.
Value
PBIS committee members need to value
designing and implementing a comprehensive
evaluation system to monitor and ensure
compliance of PBIS interventions.
Christofferson & Callahan, 2015
Clark & Estes, 2008
Freeman, Miller, & Newcomer, 2015
Self-Efficacy
PBIS committee members need to have
confidence that they can create an evaluation
system that will monitor and ensure compliance
of PBIS interventions.
Bandura, 1997
Coffey & Horner, 2012
Newcomer, Freeman, & Barrett, 2013
Pajares, 1996
Mood
PBIS committee members need to feel positive
about designing and implementing a
comprehensive evaluation system.
Christofferson & Callahan, 2015
Horner et al., 2013
Assumed Organization Influences
Individuals within an organization may possess the knowledge, skills, and motivation
required to accomplish an established goal; however, inadequate resources, bureaucracies, and
structures may prevent a goal from being achieved (Clark & Estes, 2008). Organizational
barriers can also create problems with knowledge, skills, and motivation within an organization
(Rueda, 2011). The culture of the organization can also determine how well individuals and
teams perform to accomplish a goal (Clark & Estes, 2008). In this section, literature on critical
organizational factors is reviewed.
Resources
PBIS committee members need time to create an effective comprehensive evaluation
system. Organizations require time, finances, and employees to achieve goals. In order for
PBIS committee members to create an effective and comprehensive evaluation system, time is
the most essential resource, particularly when a school district has schools at varying stages of
implementation (McIntosh et al., 2016). Some school districts introduce PBIS (2017b) without
providing time to plan, execute, or monitor the implementation of PBIS. Newcomer et al. (2013)
suggested school districts hire an intervention coordinator who could effectively monitor and
track the implementation of PBIS interventions at each school, and guide decisions on whether to
continue, intensify, or fade approaches. This level of execution requires time; therefore, an
intervention coordinator is needed to review the overall effectiveness of the interventions and
determine if aspects of the approaches need revising.
PBIS committee members need individuals to commit to design, implement, and
monitor an effective comprehensive evaluation system. Without human capacity, district leadership would not be able to work together to create a comprehensive evaluation plan to confirm annual readiness and commitment of participating schools, coordinate behavioral training, align district policies with PBIS, and evaluate the effectiveness of implementation efforts (Freeman et al., 2015).
commitment from committee members (McIntosh et al., 2016). Without district support, PBIS
(2017b) interventions would not be able to impact as many students as intended. Given this
information, it is important for school districts to support PBIS interventions and its processes
within a district.
Policies, Processes, and Procedures
PBIS committee members need to have policies that align with the goal of the school
district. When policies align with the goals of the school district, processes flow smoothly even when issues arise. Universal rules are established and defined when
implementing PBIS (2017b) interventions; they set the foundation for behavior expectations
(Cohen et al., 2007). When behavior expectations or rules are not followed, PBIS offers
additional supports for students with ongoing discipline concerns (Barrett et al., 2008). Schools
may offer ongoing systems to reward expected behaviors, systems to respond to behavioral
violations, and managing/monitoring decision-making supports that allow students to thrive in
school (Cohen et al., 2007). In addition, policies, processes, and procedures enable the
evaluation process to measure effectiveness of well-established goals and commonly used
language (Newcomer et al., 2013). The PBIS (2017b) website supports this claim, indicating that an evaluation process improves the efficiency of resource use, implementation efforts, and organizational management. Therefore, a system of ongoing evaluation is needed to enhance future approaches to PBIS.
Cultural Setting
Clark and Estes (2008) defined culture as a way to describe the core values, goals,
emotions, beliefs, and processes learned as individuals develop over time. PBIS committee
members need to be part of a culture that supports PBIS interventions that align with the mission
of investing in optimal learning environments to enhance safety and create positive school
climates. When an organization supports its mission, its employees are likely to be invested in achieving or enhancing its goals (Barrett et al., 2008). Therefore, it
is important for PBIS committee members to create a comprehensive plan that will allow them to
assess and confirm annual readiness and commitment of participating schools. The committee
will collectively coordinate trainings and evaluate the effectiveness of implementation efforts
(Freeman et al., 2015). As a unit, the PBIS committee will define a culture with a common
vision that values socioemotional learning and positive school climates. Essentially, systemic support that extends beyond an individual school is needed for successful implementation.
Overall, a systematic and supportive culture is necessary to implement an effective and
comprehensive evaluation plan. It is essential for PBIS committee members to be part of an
environment that values and prioritizes the evaluation of implementation efforts to ensure
success (Freeman et al., 2015). When a team places value on evaluation capacity, it regularly assesses whether interventions are being implemented with fidelity and measures
whether interventions are benefiting students (Horner et al., 2013). In all, establishing an
evaluation infrastructure defines a key foundation for larger measures of PBIS (2017b)
implementation in the future.
Table 4
Summary of Assumed Organization Influences on Stakeholders’ Ability to Achieve the
Performance Goal
Assumed Organization Influences
Research Literature
Author, Year; Author, Year.
Resources (time; finances; people)
PBIS committee members need time to create an
effective comprehensive evaluation system.
McIntosh et al., 2016
Newcomer, Freeman, & Barrett, 2013
PBIS committee members need individuals to
commit to design, implement, and monitor an
effective comprehensive evaluation system.
Freeman, Miller, & Newcomer, 2015
McIntosh et al., 2016
Policies, Processes, & Procedures
PBIS committee members need to have policies
that align with the goal of the SEESD.
Barrett, Bradshaw, &
Lewis-Palmer, 2008
Cohen, Kincaid, & Childs, 2007
Newcomer, Freeman, & Barrett, 2013
Positive Behavioral Interventions and
Supports (2017b)
Culture
PBIS committee members need to be part of a
culture that supports PBIS interventions that
align with the mission of investing in optimal
learning environments that enhance safety and
creates positive school climates.
Clark & Estes, 2008
Barrett, Bradshaw, &
Lewis-Palmer, 2008
Freeman, Miller, & Newcomer, 2015
PBIS committee members need to be part of a
culture that values monitoring and
accountability, ensuring that PBIS interventions
are being implemented.
Freeman, Miller, & Newcomer, 2015
Horner et al., 2013
Summary
This literature review addressed the need for efficient and effective organizational work
resources, processes and procedures, and culture that supports the achievement of established
goals. Organizational dynamics are essential because missing processes or inadequate materials
are often the barriers to the achievement of performance goals, even for individuals with top
motivation and exceptional knowledge and skills (Clark & Estes, 2008). Building capacity and
internal accountability develops a team’s commitment because it places value on the intended
goals. Therefore, considerations of requirements in the areas of knowledge and skills, motivation, and organizational factors were presented in order to effectively achieve the goal of creating a comprehensive evaluation system.
CHAPTER THREE: METHODOLOGY
The purpose of this project was to conduct a needs assessment in the areas of knowledge
and skill, motivation, and organizational resources necessary to reach the organizational
performance goal. The assessment will begin by generating a list of possible needs and will then
move to examining these systematically to focus on actual or validated needs. While a complete
needs assessment would focus on all stakeholders, for practical purposes, the stakeholder to be
focused on in this assessment is the SEESD’s PBIS committee members.
The questions that guided this gap analysis are the following:
1. What are the knowledge, motivation, and organizational needs necessary for the
SEESD’s PBIS committee members to achieve their goal of designing and implementing
a comprehensive evaluation system that will monitor and ensure 100% compliance of
PBIS interventions addressing its mission?
2. What are the recommended knowledge, motivation, and organizational solutions to those
needs?
Methodological Framework
The framework for the methodology in this study is the Gap Analysis Model (Clark &
Estes, 2008; Rueda, 2011). The Gap Analysis Model provides research-based guidelines to
identify causes of performance gaps and select the appropriate solutions (Clark & Estes, 2008).
The gap analysis identifies organizational goals, analyzes the organization’s performance, and
identifies and validates the causes of performance gaps, not relying solely on assumed causes of
gaps in order to formulate recommendations and solutions that will result in performance
improvement and goal achievement (Clark & Estes, 2008; Rueda, 2011).
The gap analysis used both qualitative and quantitative methods. Qualitative methodology leads one to discover how people feel, think, and act in a setting or situation (Patton, 2002), and this approach clarifies the human causes behind performance gaps (Clark & Estes, 2008). In this study, qualitative data were examined to validate and understand the gaps in knowledge and skill, motivation, and organizational resources identified through quantitative methods. Essentially, the gap analysis used a systematic process to examine the knowledge and skill, motivation, and organizational resources that contribute to the organization's performance gap and to create measures to monitor the effectiveness of the recommended solutions (Rueda, 2011). The steps in the gap analysis process are shown in Figure 1.
Assessment of Performance Influences
Clark and Estes (2008) recommended that an organization identify causes of performance
gaps using three critical factors: (a) knowledge and skills, (b) motivation to achieve the goal, and
(c) organizational barriers. The purpose of this gap analysis was to identify whether all
employees have adequate knowledge, motivation, and organizational support to achieve desired
goals. As previously stated, this study used Clark and Estes’ (2008) gap analysis to determine
the needs of the performance gap. In this case, the performance gap is 100%, and the goal of the
gap analysis is to identify what the organization needs to put in place to successfully create a
comprehensive evaluation plan.
Source: Clark, R. E. & Estes, F. (2008). Turning research into results. Charlotte, NC: Information Age Publishing.
Figure 1. Gap Analysis Process
Knowledge Assessment
Anderson and Krathwohl (2001) organized knowledge into four distinct subcategories. These subcategories help create learning objectives that target specific types of knowledge: declarative factual, declarative conceptual, procedural, and metacognitive knowledge. To determine and clarify knowledge influences, an analysis of each knowledge type will be conducted to create a comprehensive evaluation plan.
Declarative factual knowledge assessment. To perform their critical behaviors, the
PBIS committee needs to know the established goal to evaluate compliance of PBIS (2017b)
interventions, the components of PBIS, and the factors that comprise an effective evaluation
system that can monitor and ensure compliance of PBIS interventions.
Anderson and Krathwohl (2001) recommended using inquiry methods that require
participants to demonstrate specific knowledge to assess an individual’s declarative factual
knowledge. This requires knowledge of basic facts, information, and terminology related to
PBIS (2017b). PBIS committee members completed a questionnaire relating to PBIS at the
beginning of a planning meeting. This preliminary questionnaire required participants to
complete multiple-choice questions and recognition tasks to gauge their understanding. PBIS
committee members also completed interviews where they were asked questions that required
them to demonstrate knowledge of basic facts and information related to the goal, components of
PBIS, and measurement tools that evaluate PBIS interventions. In addition, a document analysis
was completed to review artifacts for evidence of knowledge of facts and information. Table 5 provides an overview of the methods used, along with the sample survey items, interview items, and documents analyzed.
Table 5
Summary of Knowledge Influences and Method of Assessment
Assumed Knowledge
Influences
Survey Item
Interview Item
Document
Analysis
Declarative Factual
PBIS committee members need to
know the established goal to
evaluate compliance of PBIS
interventions.
Multiple choice. Complete the
sentence.
OR
Which statement below is true?
SEESD’s goal for the evaluation
of PBIS is to…
a) Create an evaluation system
that measures compliance and
effectiveness of PBIS
interventions.
b) Build on the evaluation system
that is already in place.
c) Recreate an evaluation system
that measures compliance and
effectiveness of PBIS
interventions.
d) Merge evaluation systems into
one system that measures
compliance and effectiveness of
PBIS interventions.
Tell me what SEESD’s
goal is for the
evaluation of PBIS.
Review artifacts (visual
of established goals or
procedures)
Questions to consider:
- Do they have PBIS
training
material/documents?
- Do they reference forms
from attended trainings?
PBIS committee members need to
know the components of PBIS.
Multiple choice.
Identify what the PBIS
components are…
a) Establish expectations, teach
expectations, reinforce
expectations, correct behavior
b) Intensive, Individual
Interventions, Targeted Group
Interventions, Universal
Interventions
c) Training, coaching, evaluating
d) Individual student systems,
Classroom setting systems,
School-wide systems, District-
level systems
In your own words, tell
me the components of
PBIS.
Review artifacts (visual
of PBIS components/
systems)
Questions to consider:
- Do they have PBIS
training material/
documents?
- Do they reference forms
from attended trainings?
PBIS committee members need to
know what comprises an effective
evaluation system that can monitor
and ensure compliance of PBIS
interventions.
Multiple choice.
An effective evaluation system is
comprised of...
a) Fidelity of implementation
practices, Team self-assessments,
Walk-through observation tools,
Performance evaluations
b) PowerSchool student
information system, Team self-
assessments, Walk-through
observation tools, Performance
evaluations
c) PowerSchool student
information system, School-wide
Information System (SWIS),
Walk-through observation tools,
Performance evaluations
d) PowerSchool student
information system, Attendance
data, Suspension data, Expulsion
data, Mental Health service data,
What are some
measurement tools that
evaluate PBIS
interventions?
Review artifacts (training
material/ documents)
Questions to consider:
- Do they have PBIS
training
material/documents?
- Do they reference forms
from attended trainings?
- Do they have
information of evaluation
tools, such as
PowerSchool student
information system,
Attendance data,
Suspension data,
Expulsion data, Mental
Health service data,
School-wide Information
System (SWIS), Self
Assessment Survey
(SAS), Tiered Fidelity
Inventory (TFI)
School-wide Information System
(SWIS), Self Assessment Survey
(SAS), Tiered Fidelity Inventory
(TFI)
Declarative
Conceptual
PBIS committee members need to
know various categories of PBIS
to measure improvements and/or
areas of growth.
Multiple choice. Complete the
sentence.
____________ are the four
categories to consider when
measuring areas of growth.
a) Fidelity of implementation
practices, Team self-assessments,
Walk-through observation tools,
Performance evaluations
b) PowerSchool student
information system, Team self-
assessments, Walk-through
observation tools, Performance
evaluations
c) PowerSchool student
information system, School-wide
Information System (SWIS),
Walk-through observation tools,
Performance evaluations
d) PowerSchool student
information system, School-wide
Information System (SWIS), Self
Assessment Survey (SAS), Tiered
Fidelity Inventory (TFI)
There are four
categories to consider
when measuring
improvement of PBIS
interventions and/or
areas of growth...
1.
2.
3.
4.
Please provide an
example of these
categories and how
they each affect the
outcome.
PBIS committee members need to
know the three tiers of PBIS supports.
Multiple choice. Choose the best
option to complete the sentence.
PBIS focuses on creating and
sustaining…
a) Tier 1 (universal interventions
for all students); Tier 2 (targeted
group support for some students);
and Tier 3 (individual support for
a few students)
Please explain the
relationship between
the three tiers of
support.
b) Individual students and targeted
groups
c) Academic systems and
behavioral systems
d) Intervention strategies
PBIS committee members need to
know the system of PBIS.
Multiple choice. Choose the best
option to complete the sentence.
Foundational school-wide systems
feature…
a) PBIS implementation that is
clear and a priority to the district
b) Resources that support
implementation
c) Alignment and integration of
PBIS with other district priorities,
needs, and initiatives
d) All of the above
Please provide
examples of essential
features that are
needed when
implementing a
school-wide PBIS
system. How do these
features influence the
implementation of
PBIS?
Procedural
PBIS committee members need to
know how to create an evaluation
system.
Multiple choice. Choose the best
answer to complete the sentence.
You are the Director of Student Support Services, and there is an increase in suspension and expulsion rates district-wide despite implementing PBIS.
You should:
a) engage in fabricating formative
evaluation questions that measure
various elements of PBIS
intervention practices
b) engage in fabricating
summative evaluation questions
that measure various elements of
PBIS intervention practices
Walk me through the
steps you use to ensure
PBIS interventions are
being implemented.
Review artifacts (training
material, documents,
charts)
Questions to consider:
- Do they have PBIS
training
material/documents?
- Do they reference PBIS
forms?
c) engage in fabricating formative
and summative evaluation
questions that measure elements
of PBIS intervention practices
d) ensure there is uniformity
across the district and engage in
fabricating formative and
summative evaluation questions
that measure various elements of
PBIS intervention practices
PBIS committee members need to
know how to implement an
evaluation system.
Multiple choice. Complete the
sentence and choose all that apply
to your evaluation approach.
I evaluate when...
a) I need to ensure there is a
strong foundation before I expand
or develop
b) I need to assess for readiness
and commitment from those
participating
c) I need to determine if ongoing
training is needed to improve
efficiency and effectiveness.
d) All of the above
When and how do you
determine an
evaluation system is
needed?
PBIS committee members need to
know how to analyze the results of
the PBIS evaluation system.
Multiple choice. Complete the
sentence and choose all that apply
when analyzing evaluation results.
When I analyze results I. . .
a) organize and store the data
b) organize, review, share, and
monitor the data
c) review, share, and monitor the
data
d) share and monitor the data
Walk me through the
process of obtaining
PBIS evaluation results
and determining if
progress was made.
Review artifacts (PBIS
evaluation material/
documents)
Questions to consider:
- Do they have PBIS
training or meeting
documents?
- Do they reference forms
from attended meetings?
PBIS committee members need to
be able to incorporate PBIS
interventions into the schools.
Multiple choice. Choose the best
answer to complete the sentence.
Intervention effectiveness could
result in…
a) revisions to intervention
practices, such as procedures,
intensity, and/or monitoring
implementation integrity.
b) designing a new intervention
c) implementing a new program
d) changing the structure of the
program
Please explain how
data results influence
the way PBIS
interventions are
incorporated into the
schools.
OR
Please give an example
of how you use data
results to influence the
way PBIS
interventions are
incorporated into the
schools.
Metacognitive
PBIS committee members need to
know how to reflect on their own
abilities to create their own
comprehensive evaluation system.
Multiple choice. Complete the
sentence.
I self-evaluate by…
a) regularly monitoring my
progress towards my goals.
b) thinking through alternatives
before determining my answer.
c) motivating myself
d) All of the above.
How do you evaluate
the effectiveness of
your own abilities to
create a comprehensive
evaluation system?
PBIS committee members need to
reflect on and evaluate their own
performance.
Multiple choice. Complete the
sentence.
We (as a group) evaluate our own
performance by . . .
a) providing feedback
b) discussing what is working and
what is not working
c) making adjustments.
d) All of the above.
How do you evaluate
the effectiveness of the
committee’s evaluation
system?
Declarative conceptual knowledge assessment. Anderson and Krathwohl (2001) recommended using inquiry methods that require participants to demonstrate interrelationships among basic elements within a larger structure that enables them to function together. This requires knowledge of underlying categories, principles, or structures of PBIS (2017b). To validate this influence, PBIS committee members were required to identify, classify, or categorize principles and concepts relating to PBIS. The questionnaire required participants to complete multiple-choice questions and categorize items to select the best option to complete a sentence. In addition, PBIS committee members completed interviews where they were asked to paraphrase, give examples, or summarize concepts in their own words. Table 5 provides an overview of the methods used, along with the sample survey and interview items.
Procedural knowledge assessment. Anderson and Krathwohl (2001) recommended using inquiry methods that require participants to demonstrate how to do something. This requires knowledge of the skills and procedures involved in a task, including techniques, methods of inquiry, and criteria for using those skills, techniques, and methods. To validate this influence, PBIS committee members were required to demonstrate the ability to apply knowledge and show they could implement or execute a task relating to PBIS. Survey items required participants to select the best option in multiple-choice questions to complete a sentence. PBIS committee members also completed interviews where they were asked to articulate or demonstrate the steps necessary to perform the task. In addition, a document analysis was completed to review artifacts for evidence of the necessary methods, techniques, or steps. Table 5 provides an overview of the methods used, as well as the sample survey and interview items.
Metacognitive knowledge assessment. Anderson and Krathwohl (2001) recommended using inquiry methods that require participants to demonstrate knowledge of cognition as well as awareness and knowledge of one's own cognition. This requires the ability to reflect on and adjust necessary skills and knowledge, including general strategies, assessing demands, planning one's approach, and monitoring progress. To validate this influence, PBIS committee members were required to demonstrate metacognitive analysis before, during, and/or after a PBIS-related task. Survey items required participants to select the best option in multiple-choice questions to complete a sentence. In addition, PBIS committee members were presented with open-ended questions that required them to reflect and demonstrate metacognitive knowledge. Table 5 provides an overview of the methods used, along with the sample survey and interview items.
Motivation Assessment
Motivation is what drives individuals to act or behave in a particular manner; it is what
regulates how much effort one is willing to invest in a goal (Clark & Estes, 2008). Clark and
Estes (2008) presented three facets of motivated performance: active choice, persistence, and
mental effort. Active choice takes place when an individual actively begins pursuing a goal rather than merely intending to start. Once the individual has chosen a particular goal, problems with persistence can threaten that individual’s determination to work towards it. Choosing a
goal and persisting to accomplish it must be combined with mental effort (Clark & Estes, 2008;
Pintrich, 2003).
In addition, the confidence an individual has toward reaching the desired goal is an
essential component of how much work and energy the individual decides to apply to reach the
goal. To assess and determine motivational influences, each motivation category was analyzed in order to create a comprehensive evaluation plan.
Value assessment. Clark and Estes (2008) presented three types of values that motivate individuals to accomplish an established goal. The first is interest value, described as an individual’s intrinsic interest in mastering a new skill or goal (Clark & Estes, 2008). The second is skill value, described as individuals showcasing their abilities, suggesting they are good at certain kinds of tasks (Clark & Estes, 2008). The third is utility value, described as individuals focusing on the benefits that come when a goal is met (Clark & Estes, 2008). Given this information, participants were asked to rank several tasks and indicate which was most valuable to them; this gauged the level of value they placed on the intended goal. PBIS committee members also completed interviews where they were asked open-ended questions that required them to reflect on and articulate their reasons. Lastly, an observation analysis was completed to observe behaviors and assess level of engagement (choice, persistence, effort) in the task. Table 6 provides an overview of the methods that were used to assess, along with the sample survey, interview items, and observation analysis.
Table 6
Summary of Motivation Influences and Method of Assessment
Assumed Motivation
Influences
Survey Item
Interview Item
Observation
Analysis
Value
PBIS committee members need to
value designing and implementing
a comprehensive evaluation
system to monitor and ensure
compliance of PBIS interventions.
Put these sentences in order
of your value.
___ designing an evaluation
plan for PBIS interventions.
___ implementing an
evaluation plan for PBIS
interventions.
___ monitoring progress of
PBIS interventions.
___ determining
effectiveness of PBIS
interventions.
OR
Put these sentences in order
of your value.
___ I enjoy PBIS and its
interventions to improve
student behavior.
___ Learning how to create
an evaluation plan for PBIS
interventions is
valuable/useful for me in
terms of my future goals.
___ As a PBIS committee
member, it is important for
me to learn how to design,
implement, monitor, and
determine effectiveness of
PBIS interventions.
___ Being involved in the
PBIS committee is worth it
to me even if it takes more
time than expected.
How valuable is the
design and
implementation of PBIS
interventions?
Could you discuss some
of your reasons why we
should design or
implement an evaluation
system?
Observe
behaviors to
assess level of
engagement
(choice,
persistence,
effort) in the task.
Research
demonstrates that
value, for
example, is
particularly
predictive of
choice to engage.
NOTE: Choice,
persistence and
effort are only
“indicators” of
motivational
problems. Several
underlying
motivational
factors may be at
work so it is
important to
consider all of the
motivational
categories and
triangulate
observation with
targeted survey
items or interview
questions as
suggested.
Table 6 (Cont’d.)
Assumed Motivation
Influences
Survey Item Interview Item Observation
Analysis
Self-Efficacy
PBIS committee members need to
have confidence that they can
create an evaluation system that
will monitor and ensure
compliance of PBIS interventions.
Rate your degree of
confidence in doing the
following by indicating
strongly agree to strongly
disagree using the scale
below:
● Strongly agree
● Agree
● Neutral
● Disagree
● Strongly disagree
I can influence students,
parents, and administrators
to comply with PBIS
interventions.
If I plan accordingly, I can
meet deadlines.
I am confident in my ability
to design an evaluation plan
for PBIS.
I am confident in my ability
to monitor PBIS
interventions once they have
been implemented.
I am confident in my ability
to assess effectiveness of
PBIS interventions.
To what degree do you
feel confident about
your ability to create an
evaluation system that
will monitor and ensure
compliance of PBIS
interventions?
Tell me about a time
when you felt confident
creating an evaluation
system?
What impacts your
confidence? How
consistent is it?
Observe
behaviors for
evidence of the
role of self-
efficacy.
Observe
participants
behavior:
Do they choose to
engage?
Do they persist?
Do they invest
mental effort?
Mood
PBIS committee members need to
feel positive about designing and
implementing a comprehensive
evaluation system.
Rate your degree of
positivity in doing the
following by indicating
strongly agree to strongly
disagree using the scale
below:
Describe how you feel
about designing,
implementing,
monitoring, and
determining
effectiveness of PBIS
interventions.
Observe
behaviors for
evidence of
positivity.
Table 6 (Cont’d.)
Assumed Motivation
Influences
Survey Item Interview Item Observation
Analysis
● Strongly agree
● Agree
● Neutral
● Disagree
● Strongly disagree
I feel positive about
designing a comprehensive
evaluation system.
I feel positive when
implementing a
comprehensive evaluation
system.
I feel positive when
monitoring a comprehensive
evaluation system.
I feel positive when
determining effectiveness of
PBIS interventions.
In the face of challenges
and setbacks, do you
feel positive about
engaging in the creation
of a comprehensive
evaluation system?
Tell me how you feel
about designing a
comprehensive
evaluation system for
PBIS?
Tell me how you feel
about implementing a
comprehensive
evaluation system for
PBIS?
Tell me how you feel
about monitoring a
comprehensive
evaluation system for
PBIS?
Tell me how you feel
about determining
effectiveness of a
program?
Self-efficacy assessment. Motivation, learning, and performance are enhanced when
learners have positive expectations for success (Bandura, 1997; Clark & Estes, 2008). Self-
efficacy is a strong determinant and predictor of the level of accomplishment individuals will
attain. In order to validate this influence, PBIS committee members were asked to engage in tasks they felt confident executing. Survey items required participants to rate their degree of confidence using a five-point Likert scale. PBIS committee members also completed interviews where they
were asked open-ended questions that required reflection and reasoning. Lastly, an observation
analysis was completed to observe behaviors and assess self-efficacy. Table 6 provided an
overview of the methods that were used to assess, along with the sample survey, interview items,
and observation analysis.
Mood assessment. Christofferson and Callahan (2015) recommended using inquiry
methods that require participants to share their experiences. With this in mind, PBIS committee
members were asked to engage in tasks they felt positive about executing. Survey items required participants to rate their degree of positivity using a five-point Likert scale. PBIS
committee members also completed interviews where they were asked open-ended questions that
required reflection and reasoning. Lastly, an observation analysis was completed to observe
behaviors and assess positivity. Table 6 provided an overview of the methods that were used to
assess, along with the sample survey, interview items, and observation analysis.
Organization, Culture, Context Assessment
Individuals within an organization may possess the knowledge, skills, and motivation
required to accomplish an established goal; however, inadequate resources, bureaucracies, and
structures may prevent a goal from being achieved (Clark & Estes, 2008). Organizational
barriers can also create problems with knowledge, skills, and motivation within an organization
(Rueda, 2011). The culture of the organization can also determine how well individuals and
teams perform to accomplish a goal (Clark & Estes, 2008). Assumed performance indices
related to resources, policies, processes and procedures, cultural setting, and cultural model
discussed in Chapter Two were assessed. The specific assessment procedures described were
guided by literature related to motivation, assessment, and inquiry. Table 7 provides an
overview of the methods that were used to assess, along with the sample survey, interview items,
and observation analysis.
Assessing resources. Organizations require time, finances, and employees to achieve
goals. In order for PBIS committee members to create an effective and comprehensive
evaluation system, time is the most essential resource, particularly when a school district has
schools at varying stages of implementation (McIntosh et al., 2016). In addition, human capacity is another resource needed to create a comprehensive evaluation plan to confirm annual readiness and commitment of participating schools, coordinate behavioral training, align district policies with PBIS (2017b), and evaluate the effectiveness of implementation efforts (Freeman et al., 2015). Given this information, it is important to validate these statements using
inquiry methods to assess the district's resources.
PBIS committee members completed a questionnaire where they were asked to indicate
the extent to which a particular item was present at their school sites. Survey items required
participants to rate the resources provided to them using a five-point Likert scale. PBIS
committee members also completed interviews where they were asked open-ended questions to
gauge their knowledge of information and understanding. Lastly, an observation analysis was
completed to observe behaviors and assess resources. Table 7 provides an overview of the methods that were used to assess, along with the sample survey, interview items, and
observation analysis.
Table 7
Summary of Organization Influences and Method of Assessment
Assumed Organization
Influences
Survey Item
Interview Item
Observation
Analysis
Resources (time; finances;
people)
PBIS committee members need time
to create an effective comprehensive
evaluation system.
Please indicate the extent
to which the item is
present at your school site
using the scale below:
● Strongly agree
● Agree
● Neutral
● Disagree
● Strongly disagree
My school site provides
planning time to design,
implement, and monitor
compliance of PBIS
interventions.
What kind of planning
time does your school
allocate for PBIS? How
much time? How do you
utilize this time?
Observe behavior
(satisfaction,
frustration)
PBIS committee members need
individuals to commit to design,
implement, and monitor an
effective comprehensive evaluation
system.
Please indicate the extent
to which the item is
present at your school site
using the scale below:
● Strongly agree
● Agree
● Neutral
● Disagree
● Strongly disagree
I receive support from the
administrative staff at my
school site.
OR
The administrative team at
my school site supports
my efforts.
What type of support has
your school provided to
design, implement, and
monitor PBIS?
If things are not going
well with the design,
implementation, and
monitoring of PBIS
interventions, how would
the school administrators
react?
Observe behavior
(satisfaction,
frustration, level of
commitment),
meeting, retrieve
meeting notes
Policies, Processes, &
Procedures
PBIS committee members need to
have policies that align with the
goal of the SEESD.
Please indicate the extent
to which the item is
present at your school
district using the scale
below:
To what extent do your
district’s policies align
with SEESD’s goal?
Obtain district
policies, processes
and procedures
Table 7 (Cont’d.)
Assumed Organization
Influences
Survey Item Interview Item Observation
Analysis
● Strongly agree
● Agree
● Neutral
● Disagree
● Strongly disagree
The district's policies
align with SEESD’s PBIS
implemented
interventions.
To what extent do the
district’s policies align
with SEESD’s expected
level of implementation?
Obtain district
policies, processes
and procedures
Culture
PBIS committee members need to
be part of a culture that supports
PBIS interventions that align with
the mission of investing in optimal
learning environments that enhance
safety and create positive school
climates.
Please indicate the extent
to which the item is
present at your school site
using the scale below:
● Strongly agree
● Agree
● Neutral
● Disagree
● Strongly disagree
The district's mission of
investing in optimal
learning environments that
enhance safety and create
positive school climates
aligns with PBIS
interventions at my school
site.
The district values and
recognizes schools that
apply PBIS interventions.
My school site values and
recognizes those who
apply PBIS interventions.
How does your school
create and demonstrate a
culture of PBIS?
Illustrate what the school
culture is like.
Observe meetings,
retrieve meeting
notes, observe
offices,
classrooms, and the
entire school.
PBIS committee members need to
be part of a culture that values
monitoring and accountability,
ensuring that PBIS interventions
are being implemented.
Please indicate the extent
to which the item is
present at your school site
using the scale below:
● Strongly agree
● Agree
● Neutral
● Disagree
● Strongly disagree
Show me how the district
ensures PBIS
interventions are being
implemented at your
school site.
How does your school
site ensure compliance of
PBIS?
Observe meetings,
retrieve meeting
notes, observe
offices,
classrooms, and the
entire school.
Table 7 (Cont’d.)
Assumed Organization
Influences
Survey Item Interview Item Observation
Analysis
The district monitors and
holds schools accountable
for the implementation of
PBIS interventions.
My school site monitors
and holds the entire school
accountable for the
implementation of PBIS
interventions.
Assessing policies, processes, and procedures. When policies and procedures align with the goals of the school district, processes flow smoothly even when issues arise. There
are universal rules that are established and defined when implementing PBIS interventions; they
set the foundation for behavior expectations (Cohen et al., 2007). When behavior expectations
or rules are not followed, PBIS offers additional supports for students with ongoing discipline
concerns (Barrett et al., 2008). Schools may offer ongoing systems to reward expected
behaviors, systems to respond to behavioral violations, and managing/monitoring decision-
making supports that allow students to thrive in school (Cohen et al., 2007). In addition,
policies, processes, and procedures enable the evaluation process to measure effectiveness of
well established goals and commonly used language (Newcomer et al., 2013). Given this
information, it is important to validate these statements using inquiry methods to assess the
district's policies, processes, and procedures.
PBIS committee members completed a questionnaire where they were asked to indicate
the extent to which a particular item was present at their school site. Survey items required
participants to rate whether the district's policies align with SEESD’s implemented PBIS interventions using a five-point Likert scale. PBIS committee members also completed
interviews where they were asked open-ended questions to gauge their knowledge of information
and understanding. In addition, a document analysis was completed to review artifacts for
evidence of district policies, processes, and procedures. Table 7 provided an overview of the
methods that were used to assess, along with the sample survey, interview items, and documents
that were analyzed.
Assessing cultural setting. When an organization actively supports its mission, its employees are likely to be invested in achieving or enhancing its goals (Barrett et al., 2008). Therefore, it is important for PBIS committee members to create a comprehensive plan that allows them to assess and confirm the annual readiness and commitment of participating schools. As a unit, the PBIS committee could define a culture with a common vision that values socioemotional learning and positive school climates. In order to validate this influence, the assessment measured visible, concrete manifestations of cultural models that appear within activity settings (Gallimore & Goldenberg, 2001).
PBIS committee members completed a questionnaire where they were asked to indicate
the extent to which a particular item was present at their respective school site. Survey items
required participants to rate whether the district's mission aligns with PBIS interventions using a five-point Likert scale. PBIS committee members also completed interviews where they
were asked open-ended questions regarding the culture of PBIS at each school site. Lastly, an observation analysis was completed to observe behaviors during meetings, in classrooms, and throughout the school. Table 7 provided an overview of the methods that were used, along with
the sample survey, interview items, and observation analysis.
Participating Stakeholders and Sample Selection
In this study, the key stakeholder group is the PBIS committee, which will perform the critical behaviors needed to achieve its performance goal. The PBIS committee will be composed of eight school administrators, eight school site social workers, and eight teacher representatives; that is, three representatives from each of eight schools (n = 24). This group was selected in order to have representation of all schools across the district. The PBIS committee members’ roles were instrumental to the development of PBIS; together, they will create a comprehensive evaluation system, implement it, and analyze the data it produces.
Patton (2002) suggested that both quantitative and qualitative methods are essential to a strong study. The use of triangulation across multiple data sources also strengthens a study; therefore, surveys and interviews were used to complete the research study.
Surveys
Surveys are information collection methods used to describe, compare, or explain individual and societal knowledge, feelings, values, preferences, and behavior (Fink, 2016). This study utilized surveys to assess Knowledge and Skills (K), Motivation (M), and Organizational Factors (O) related to competencies. One to three items were developed for each of the three levels of Clark and Estes’s (2008) framework. The survey contained 20 items spanning multiple-choice, sentence-completion, identification, and rating formats. PBIS committee members were
recruited via email. Each participant received an individual invitation that included a link to the
survey. The survey was administered using the research software Qualtrics. Please refer to the instrument in Appendix A. Qualtrics required respondents to answer all survey items before
moving on to the next group of items. Respondents were only allowed to submit the survey
once. In all, the Qualtrics software maintained the respondents’ confidentiality, as no identifying information was collected.
Interviews
In order to triangulate data, a semi-structured, open-ended interview protocol was
designed to further assess Knowledge and Skills (K), Motivation (M), and Organizational
Factors (O) related to competencies. One to three items were developed for each of the three levels of Clark and Estes’s (2008) framework. Participants were recruited when the survey
link was emailed. In the body of the email, a link to schedule an in-person interview was
provided. The interview protocol contained 28 items, and each interview was approximately one hour in length. See Appendix B for a list of the 28 interview questions. Participants selected the
best date and time for their interview. Each interview was conducted individually, after school.
Interview responses were kept confidential and used only for analysis. In all, this sampling approach provided rich data consistent with the descriptive nature of qualitative research (Merriam & Tisdell, 2016).
Observation checklist and document analysis design. Observations of a number of behaviors were conducted to further assess assumed motivation and organization causes. The
observations assessed engagement in tasks, evidence of self-efficacy and positivity, and
satisfaction. In addition, a document analysis was conducted to further assess knowledge of
facts, information, and terminology. The analysis reviewed artifacts, documents, meeting notes,
charts, and evaluation tools as evidence of knowledge of PBIS. A printed copy of the research
questions was kept with the investigator in order to focus their attention during the observation.
Data Collection
Following University of Southern California Institutional Review Board (IRB) approval,
participants were solicited via email to participate in surveys and interviews. All
communications regarding survey and interviews came directly from the investigator.
Surveys
The investigator sent a direct email link to the web-based survey to all participants. The
web-based survey was hosted by Qualtrics, a research software. The investigator sent reminder
emails on a weekly basis. The survey was available for two to three weeks after the initial email.
Interviews
The investigator sent an email for participants to schedule an in-person interview at their
respective school site. The interview lasted approximately one hour and took place after school
hours. Interviews were held in a quiet, private, and spacious room where the interviewee and
investigator could have an open discussion without interruption, and without compromising
confidentiality. With the interviewees’ consent, the interviews were audio recorded.
Observations
Observation data were collected both inside (overt) and outside (covert) the office or classroom setting (i.e., structured activities in the office or classroom and unstructured activities on the playground or other areas of the school campus). The observations lasted approximately one hour in and out of the office or classroom, and assessed the administrators’, social workers’, and teachers’ engagement in tasks, self-efficacy, positivity, and satisfaction. In addition, the investigator
assessed knowledge of facts, information, and terminology.
Document Analysis
An analysis was conducted to review artifacts, documents, meeting notes, charts, and
evaluation tools as evidence of knowledge. Observations and interviews allowed for follow-up questions regarding particular artifacts that might support the data collection process. The investigator spent approximately one hour reviewing artifacts from each
participant (if applicable).
Data Analysis
The unit of analysis for this inquiry project is the Social Emotional Elementary School
District. The purpose of this inquiry project suggested a mixed-methods approach. The
strategies that were used to analyze survey and interview data are explained in this section.
Descriptive statistics were used to summarize the survey data. Descriptive statistics convey information regarding frequency distributions, measures of central tendency, and measures of variation (Fink, 2016). The qualitative data were collected through semi-structured interviews; the transcript text was coded using symbols that represented the categories of knowledge and skills, motivation, and organization, which captured relevant information and guided the identification of causes with additional detail (Merriam & Tisdell, 2016). Merriam and Tisdell (2016) also supported the use of open, axial, and selective codes in order to organize qualitative data. In all, the results of the data collected were analyzed and are presented in Chapter Four.
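The descriptive statistics described above can be sketched in a few lines of Python. This is an illustrative example only: the `describe` helper and the `responses` values are hypothetical and are not drawn from the study's data; the sketch simply shows the frequency distribution, mean, and standard deviation that would be reported for one five-point Likert item.

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical responses to one five-point Likert survey item
# (1 = strongly disagree ... 5 = strongly agree); not actual study data.
responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

def describe(scores):
    """Summarize one survey item with the statistics named above:
    frequency distribution, mean (central tendency), and standard
    deviation (variation)."""
    return {
        "n": len(scores),
        "mean": round(mean(scores), 2),
        "sd": round(stdev(scores), 2),
        "frequencies": dict(sorted(Counter(scores).items())),
    }

print(describe(responses))
```

Running the sketch on the hypothetical item above yields a mean of 3.9 with a standard deviation of 0.99, alongside the frequency count for each scale point.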
Trustworthiness of Data
Ensuring validity and reliability in qualitative research involved conducting the
investigation in a credible and trustworthy manner that took ethical concerns into consideration
(Merriam & Tisdell, 2016). For instance, a strategy that could be used to increase the credibility
of research findings was to use triangulation (i.e., observations and interviews) to confirm
emerging findings. Once observations and interviews were completed, transcribing the findings
and soliciting feedback from the respondents ensured validation (Maxwell, 2013). This approach helped the investigator interpret findings by asking respondents whether they were plausible. Lastly, the investigator
ensured credibility and trustworthiness of findings by critically analyzing and highlighting their
position as a researcher (i.e., reflexivity); hence, self-reflecting on own assumptions, biases, and
perspectives on the subject of PBIS (Merriam & Tisdell, 2016).
Role of Investigator
The investigator in this study is an employee of the Social Emotional Elementary School
District. To protect against bias, an external investigator completed the interviews or transcribed
them. Although this added extra costs and time, it protected against individual bias during data
analysis. It is important to note that the participants are not subordinates to the investigator. In
addition, the investigator has no direct role in evaluating performance or administering punitive
actions upon the participants.
Limitations
The design was intended to yield rich data and recommendations for the SEESD; however, more time would have been needed to interview and observe more individuals and schools. This would have allowed further analysis, comparisons, and interpretations to be made in order to support stronger assertions.
CHAPTER FOUR: RESULTS AND FINDINGS
The purpose of this inquiry was to examine the needs of SEESD with regard to the design
and implementation of a comprehensive evaluation system that will monitor and ensure
compliance of PBIS interventions addressing its mission. A needs assessment was performed in
the areas of knowledge and skill, motivation, and organizational resources necessary to perform
the critical behaviors to reach the organizational performance goal of designing and
implementing a comprehensive evaluation system that will monitor and ensure 100% compliance
of PBIS interventions addressing its mission. The assessment generated a list of assumed causes
that were delineated in Chapter Three and were examined systematically to determine and clarify
influences in each area. A mixed-method approach was used to capture data that identified and
validated perceived causes of performance gaps relevant to the SEESD’s PBIS committee
reported in Chapter Four. Specifically, surveys, interviews, and document analysis data were
collected to understand the knowledge, motivation, and organization challenges PBIS committee
members encountered when designing and implementing a comprehensive evaluation system.
Potential solutions to address the validated causes of the gaps will be presented and discussed in
Chapter Five.
Participating Stakeholders
During the 2016-2017 academic school year, 345 certificated staff members (i.e.,
principals, assistant principals, school social workers, and teachers) employed by the SEESD
were solicited to complete surveys and interviews. Of the 345 solicited certificated staff
members, 77 SEESD certificated staff members (22%) completed the survey. Of the 77
certificated staff members who completed the survey, six were principals, five were assistant
principals, 10 were school social workers, and 58 were school teachers. Of the 345 solicited
certificated staff members, 24 SEESD certificated staff members (7%) participated in the
qualitative portion of the study. Of the 24 certificated staff members who participated in the
interviews, eight were principals, four were assistant principals, eight were school social
workers, and four were teachers.
Data Validation
The first question addressed in this study was: What are the knowledge, motivation, and
organizational needs necessary for the SEESD’s PBIS committee members to achieve their goal
of designing and implementing a comprehensive evaluation system that will monitor and ensure
100% compliance of PBIS interventions addressing its mission? To do this, triangulation was
used to check for credibility and trustworthiness of the research study (Merriam & Tisdell,
2016). Findings from the survey and interviews provided evidence of the gap in the assumed
needs of PBIS committee members to create a comprehensive evaluation system that will
monitor and ensure 100% compliance of PBIS interventions.
Anonymous surveys and interviews were used to clarify findings and learn more about
the design and implementation of a comprehensive evaluation system. Seventy-seven survey
responses and 24 interviews were used to assess assumed knowledge and skill, motivation, and
organization for an evaluation system that will monitor and ensure 100% compliance of PBIS
interventions. For further triangulation, documents were collected and analyzed to verify
information that was gathered from surveys and interviews. The results of this process are
presented in the sections that follow.
Criteria for Validation of the Data
In order to create an evaluation system, the assessment of SEESD’s PBIS committee
members needed to meet a certain criterion in the areas of knowledge and skill, motivation, and
organization. Therefore, a criterion of 80% was established to measure the level of each assumed need, given that SEESD’s PBIS committee members did not have a comprehensive evaluation system that monitors and ensures compliance of PBIS interventions. The surveys, interviews, and document analysis were validated against this 80% criterion, as further discussed below.
Survey
Data from the survey were reported using descriptive statistics which included the mean
and standard deviation scores. In this study, a criterion of 80% was used as a measure of the gap.
The results differed across the knowledge and skill, motivation, and organizational categories.
Knowledge category. In the knowledge category, four types of knowledge were
examined in the survey. They were the factual, conceptual, procedural, and metacognitive
knowledge types. The assumption as described in the previous chapters was that there was a
knowledge gap among the PBIS committee members. Therefore, a response rate of 80% or above on the expected correct answer did not validate the assumed knowledge gap, while a knowledge question with a response rate below 80% on the expected correct answer validated the assumed knowledge gap.
Motivation category. In the motivation category, three types of motivation were
examined in the survey. They were value, self-efficacy, and mood motivation types. The
assumption as described in the previous chapters was that there was a motivation gap among
PBIS committee members. Therefore, a response rate of 80% and above on the expected correct
answer did not validate the assumed motivation gap while a motivation question with a response
rate of below 80% on the expected correct answer validated the assumed motivation gap.
Organizational category. In the organizational category, three types of organizational
influences were examined in the survey. They were resources, policies and procedures, and
culture. The assumption as described in the previous chapters was that there was an
organizational gap at SEESD. Therefore, a response rate of 80% and above on the expected
correct answer did not validate the assumed organizational gap while an organization question
with a response rate of below 80% on the expected correct answer validated the assumed
organizational gap.
Interviews
Data from the semi-structured, one-hour interviews of the certificated staff members who
participated provided rich insight into the knowledge and skill, motivation, and organizational
factors of the assumed influences. In this study, a criterion of 80% was used to resolve
conflicting results from survey responses. Validation of the answers was based on common
responses from the participants (19 out of 24 participants), grouped by knowledge type,
motivation type, and organizational factor. The interviews validated some causes not revealed
in the survey responses; therefore, interview results helped to determine gaps in the particular
areas indicated below.
Document Analysis
Data from reviewed documents that were presented during interviews were used to verify
and answer underlying questions regarding compliance of PBIS interventions. Data gathered
from the documents included posters with school-wide expectations, meeting notes and
spreadsheets, binders outlining procedures, and incentives for students. Data analysis from
surveys and interviews revealed conflicting responses; therefore, document analysis clarified
particular responses.
Results and Findings for Knowledge Causes
The results and findings of the knowledge causes were reported using the knowledge
categories and assumed knowledge influences for each category. In the knowledge category,
four types of knowledge were examined. They were the factual, conceptual, procedural, and
metacognitive knowledge types as shown in Table 4.
Factual Knowledge
Assumed knowledge influence #1: PBIS committee members need to know the
established goal to evaluate compliance with PBIS interventions.
Survey results. Although the criterion for this item was 80%, as shown in Table 8, only
47% of the participants correctly indicated that the goal to evaluate compliance of PBIS
interventions was to create an evaluation system that measures compliance and effectiveness of
PBIS interventions. Fifty-three percent of the participants responded incorrectly, thus indicating
the participants need to know the established goal before designing and implementing a
comprehensive evaluation system that will monitor and ensure 100% compliance of PBIS
interventions. These results demonstrated that improvement can be made for PBIS committee
members in their understanding of the established goal.
Table 8
Survey Results for Committee Members’ Factual Knowledge of the Established Goal

#  Factual Knowledge Item  %  Count
   SEESD’s goal for the evaluation of PBIS is to:
1  Create an evaluation system that measures compliance and effectiveness of PBIS interventions.*  46.8%  36
2  Build on the evaluation system that is already in place.  26.0%  20
3  Recreate an evaluation system that measures compliance and effectiveness of PBIS interventions.  5.2%  4
4  Merge evaluation systems into one system that measures compliance and effectiveness of PBIS interventions.  22.0%  17
   Total  100%  77
*Correct response
Interview findings. Based on 24 interviews, there was not clear agreement among the
participants on SEESD’s established goal. Only 18 of the 24 participants provided the expected
response, a rate of 75% that fell short of the established 80% criterion for interview findings.
Participant 16 stated, “the goal is to set behavior expectations, teach and reinforce positive
behaviors, and implement a process that measures effectiveness.” Participant 19 stated, “the
concept is to teach and model positive behaviors to create a positive school culture, hence
creating a safe place for all. The goal is to measure if these approaches are demonstrating
effectiveness across the board.” These results demonstrated that improvement can be made for
PBIS committee members in their understanding of the established goal.
Document analysis. Participants provided and referenced documents and visual aids
which demonstrated a clear understanding of SEESD’s established goal. For instance, some
participants provided documentation indicating the goal was to determine effectiveness of PBIS
interventions during their interviews.
Summary. Survey, interview, and document analysis findings demonstrated some
knowledge of the established goal; however, there was not clear agreement among PBIS
committee members at the established 80% criterion.
Assumed knowledge influence #2: PBIS committee members need to know the
components of PBIS.
Survey results. Although the criterion for this item was 80%, as shown in Table 9, only
27% of the participants correctly indicated that the four components of PBIS are to analyze
individual student systems, classroom setting systems, school-wide systems, and district-level
systems. Seventy-three percent of the participants responded incorrectly, thus indicating
participants need to know the four PBIS components before designing and implementing a
comprehensive evaluation system that will monitor and ensure 100% compliance of PBIS
interventions. These results demonstrated that improvement can be made for PBIS committee
members to learn and understand the four components of PBIS.
Table 9
Survey Results for Factual Knowledge of the Four Components of PBIS

#  Factual Knowledge Item  %  Count
   The four PBIS components are:
1  Establish expectations, teach expectations, reinforce expectations, correct behavior.  61.0%  47
2  Intensive Individual Interventions, Targeted Group Interventions, Universal Interventions.  9.1%  7
3  Training, coaching, evaluating.  2.6%  2
4  Individual student systems, Classroom setting systems, School-wide systems, District-level systems.*  27.3%  21
   Total  100%  77
*Correct response
Interview findings. Based on 24 interviews, there seemed to be clear agreement among
the participants on the components of PBIS. Twenty of the 24 participants provided the
expected response, meeting the established 80% criterion for interview findings. Participant 3
stated, “there are various systems that are part of PBIS which address each student, classroom,
grade level, and all schools across our district.” Participant 10 emphasized, “these systems work
together to create culture shifts at each school, in each classroom, and in every individual.”
These results demonstrated that PBIS committee members have a clear understanding of PBIS
components.
Document analysis. Participants provided and referenced documents and visual aids
which demonstrated a clear understanding of the four PBIS components. For instance, training
materials and documents were provided by participants who were interviewed. These documents
showed diagrams of systems, tiers, and matrices which highlighted each component.
Summary. The survey results indicated 27% of the participants correctly identified the
four components of PBIS, whereas interview findings revealed clear agreement among the
participants, meeting the established 80% criterion. Document analysis demonstrated a clear
understanding of each component.
Assumed knowledge influence #3: PBIS committee members need to know what
comprises an effective evaluation system that can monitor and ensure compliance of PBIS
interventions.
Survey results. Although the criterion for this item was 80%, as shown in Table 10, only
66% of the participants correctly indicated that an effective evaluation system is comprised of
the PowerSchool student information system, attendance data, suspension data, expulsion data,
mental health service data, the School-wide Information System (SWIS), the Self-Assessment
Survey (SAS), and the Tiered Fidelity Inventory (TFI). Thirty-four percent of the participants
responded incorrectly, thus indicating participants need to know what an effective evaluation
system is comprised of before designing and implementing a comprehensive evaluation system that will
monitor and ensure 100% compliance of PBIS interventions. These results demonstrated that
improvement can be made for PBIS committee members in their understanding of an effective
evaluation system.
Table 10
Survey Results for Understanding an Effective Evaluation System

#  Factual Knowledge Item  %  Count
   An effective evaluation system is comprised of:
1  Fidelity of implementation practices, Team self-assessments, Walk-through observation tools, Performance evaluations.  23.4%  18
2  Fidelity of implementation practices, Team self-assessments, Walk-through observation tools, Performance evaluations.  2.6%  2
3  PowerSchool student information system, School-wide Information System (SWIS), Walk-through observation tools, Performance evaluations.  7.8%  6
4  PowerSchool student information system, Attendance data, Suspension data, Expulsion data, Mental Health service data, School-wide Information System (SWIS), Self-Assessment Survey (SAS), Tiered Fidelity Inventory (TFI).*  66.2%  51
   Total  100%  77
*Correct response
Interview findings. Based on 24 interviews, there was not clear agreement among the
participants on what comprises an effective evaluation system. Only 15 of the 24 participants
provided the expected response, a rate of 63% that fell short of the established 80% criterion
for interview findings. Participant 5 stated, “aside from using PowerSchool information, we also
use attendance, SAS, and TFI data to get a pulse of how we are doing as a school and get a sense
of where we need to go.” Participant 17 added, “suspension data, expulsion data, and office
referrals give us information on where the needs are to make informed decisions.” These results
demonstrated that improvement can be made for PBIS committee members in their
understanding of an effective evaluation system.
Document analysis. Training materials and documents were provided by some PBIS
committee members who were interviewed which demonstrated a clear understanding of what
comprises an effective evaluation system. For instance, some of them had attendance charts,
suspension and expulsion data posted on their office walls. Some of the participants accessed
digital documents on their computer where they showed lists of students receiving counseling
services and links to SWIS, SAS, and TFI data.
Summary. Survey, interview, and document analysis findings demonstrated some
knowledge of what comprises an effective evaluation system; however, there was not clear
agreement among PBIS committee members at the established 80% criterion.
Conceptual Knowledge
Assumed conceptual influence #1: PBIS committee members need to know the
various categories of PBIS to measure improvements and/or areas of growth.
Survey results. Although the criterion for this item was 80%, as shown in Table 11, only
29% of the participants correctly indicated that the four categories to consider when measuring
areas of growth are fidelity of implementation practices, team self-assessments, walk-through
observation tools, and performance evaluations. Seventy-one percent of the participants
responded incorrectly, thus indicating participants need to know the four categories to consider
when measuring areas of growth before designing and implementing a comprehensive evaluation
system that will monitor and ensure 100% compliance of PBIS interventions. These results
demonstrated that improvement can be made for PBIS committee members to learn and
understand the four categories to measure areas of growth.
Table 11
Survey Results to Learn and Understand the Four Categories to Measure Areas of Growth

#  Conceptual Knowledge Item  %  Count
   The four categories to consider when measuring areas of growth are:
1  Fidelity of implementation practices, Team self-assessments, Walk-through observation tools, Performance evaluations.*  28.6%  22
2  PowerSchool student information system, Team self-assessments, Walk-through observation tools, Performance evaluations.  10.4%  8
3  PowerSchool student information system, School-wide Information System (SWIS), Walk-through observation tools, Performance evaluations.  13.0%  10
4  PowerSchool student information system, School-wide Information System (SWIS), Self-Assessment Survey (SAS), Tiered Fidelity Inventory (TFI).  48.0%  37
   Total  100%  77
*Correct response
Interview findings. Based on 24 interviews, there was not clear agreement among the
participants on the categories that measure improvements and/or areas of growth. Only 19 of
the 24 participants provided the expected response, a rate of 79% that fell just short of the
established 80% criterion for interview findings. Participant 5 stated, “I like to visit the classrooms, walk
around the school, and assess individual and team level performance.” Participant 10 stated,
the way we measure this is by looking at our SAS, TFI, SWIS, and ODR data. As a PBIS
team we talk about what seems to be going well, what areas we need to improve, and
provide feedback on implemented approaches. This is what makes us better.
These results demonstrated that improvement can be made for PBIS committee members to learn
and understand the categories that measure growth.
Document analysis. Documents were not provided nor analyzed.
Summary. Survey results demonstrated minimal knowledge of the categories to measure
improvements and/or areas of growth, whereas interview findings demonstrated some
knowledge, nearly reaching the established 80% criterion. Overall, there was not clear
agreement among PBIS committee members at the established criterion.
Assumed conceptual influence #2: PBIS committee members need to know the
three-tiered PBIS model.
Survey results. Although the criterion for this item was 80%, as shown in Table 12, only
71% of the participants correctly indicated that PBIS focuses on creating and sustaining Tier 1
(universal interventions for all students); Tier 2 (targeted group support for some students); and
Tier 3 (individual support for a few students). Twenty-nine percent of the participants responded
incorrectly, thus indicating participants need to know the three-tiers of PBIS supports. These
results demonstrated that an improvement can be made for PBIS committee members to learn
and understand the PBIS model.
Table 12
Survey Results for Conceptual Knowledge of the Three-Tiered PBIS Model

#  Conceptual Knowledge Item  %  Count
   PBIS focuses on creating and sustaining:
1  Tier 1 (universal interventions for all students); Tier 2 (targeted group support for some students); and Tier 3 (individual support for a few students).*  71.4%  55
2  Individual students and targeted groups.  2.6%  2
3  Academic systems and behavioral systems.  18.2%  14
4  Intervention strategies.  7.8%  6
   Total  100%  77
*Correct response
Interview findings. Based on 24 interviews, there seemed to be clear agreement among
the participants on the three-tiered PBIS model. Twenty of the 24 participants provided the
expected response, meeting the established 80% criterion for interview findings. Participant
4 stated, “tier one addresses the needs of 80% of the student population, tier two addresses 15%,
and tier three addresses 5%. Each tier is targeted and builds on each other to create a positive
school environment.” Participant 18 emphasized,
all three tiers provide behavior support to all students based on need. For example, the
primary tier provides a universal approach, the secondary tier provides interventions for
some students, and the tertiary tier provides intensive interventions for individual
students. At the end of the day, everyone receives some form of intervention.
These results demonstrated that PBIS committee members have a clear understanding of the
three-tiered PBIS model.
Document analysis. Documents were not provided nor analyzed.
Summary. The survey results indicated 71% of the participants correctly identified the
three tiers of PBIS supports, and interview findings revealed and supported clear agreement
among the participants, meeting the established 80% criterion. Documents were not analyzed;
however, the survey and interview findings provided sufficient clarity regarding conceptual knowledge.
Assumed conceptual influence #3: PBIS committee members need to know the
system of PBIS.
Survey results. As shown in Table 13, 84% of the participants correctly indicated that
PBIS foundational school-wide systems feature implementation that is clear and a priority to the
district, resources that support implementation, and alignment and integration of PBIS with other
district priorities, needs, and initiatives, therefore demonstrating their correct conceptual
knowledge. This percentage exceeded the established criterion of 80%. However, this also
showed that some improvement can be made for PBIS committee members to learn and
understand the system of PBIS as 16% noted otherwise. The assumed influence was that the
PBIS committee members do not know the system of PBIS.
Table 13
Survey Results for Conceptual Knowledge of PBIS Committee Members

#  Conceptual Knowledge Item  %  Count
   Foundational school-wide systems feature:
1  PBIS implementation that is clear and a priority to the district.  7.8%  6
2  Resources that support implementation.  1.3%  1
3  Alignment and integration of PBIS with other district priorities, needs, and initiatives.  6.5%  5
4  All of the above.*  84.4%  65
   Total  100%  77
*Correct response
Interview findings. Based on 24 interviews, there was not clear agreement among the
participants on the system of PBIS. Only 15 of the 24 participants provided the expected
response, a rate of 63% that fell short of the established 80% criterion for interview findings. Participant
15 stated,
I am not so sure what essential features are needed when implementing a school-wide
PBIS system because I am in the classroom; however, I do know there are recognitions
and incentives for those who demonstrate positive behaviors. I also know there is student
behavior data that is tracked somehow.
Participant 21 indicated, “what is needed is adult-buy-in, time, and resources. Without
these, it is very difficult to implement a school-wide system.” These results demonstrated that
improvement can be made for PBIS committee members to learn and understand the system of
PBIS.
Document analysis. Documents were not provided nor analyzed.
Summary. Survey results demonstrated knowledge of foundational school-wide PBIS
systems, exceeding the established criterion of 80%. On the other hand, interview findings
demonstrated only some knowledge of the essential features needed, reaching only 63% against
the established 80% criterion. Overall, there was not clear agreement among PBIS committee
members at the established criterion.
Procedural Knowledge
Assumed procedural influence #1: PBIS committee members need to know how to
implement an evaluation system.
Survey results. As shown in Table 14, 84% of the participants correctly indicated they
evaluate when they need to ensure there is a strong foundation before expanding or developing,
assessing for readiness and commitment from those participating, and determining if ongoing
training is needed to improve efficiency and effectiveness of PBIS interventions. This
percentage exceeded the established criterion of 80%. However, this also showed that an
improvement can be made for PBIS committee members to learn how to implement an
evaluation system as 16% noted otherwise. The assumed influence was that the PBIS committee
members do not know how to implement an evaluation system.
Table 14
Survey Results for Procedural Knowledge of PBIS Committee Members

#  Procedural Knowledge Item  %  Count
   I evaluate when:
1  I need to ensure there is a strong foundation before I expand or develop.  9.1%  7
2  I need to assess for readiness and commitment from those participating.  1.3%  1
3  I need to determine if ongoing training is needed to improve efficiency and effectiveness.  5.2%  4
4  All of the above.*  84.4%  65
   Total  100.0%  77
*Correct response
Interview findings. Based on 24 interviews, there was not clear agreement among the
participants on how to implement an evaluation system. Only 14 of the 24 participants
provided the expected response, a rate of 58% that fell short of the established 80% criterion for interview
findings. Participant 7 stated, “I observe students that frequently get in trouble and analyze the
amount of times they report to the principal’s office.” Participant 8 indicated, “it is up to the
administrators, the evaluation occurs when the principal or assistant principal send a survey to
us.” These results demonstrated that improvement can be made for PBIS committee members to
learn how to implement an evaluation system on their own.
Document analysis. Documents were not provided nor analyzed.
Summary. Survey results demonstrated procedural knowledge, as indicated by 84% of
responses exceeding the established 80% criterion. On the other hand, interview findings
demonstrated minimal knowledge of when and how to create an evaluation system, reaching
only 58% against the established 80% criterion. Overall, there was not clear agreement among
PBIS committee members at the established criterion for procedural knowledge.
Assumed procedural influence #2: PBIS committee members need to know how to
analyze the results of the PBIS evaluation system.
Survey results. Although the criterion for this item was 80%, as shown in Table 15, only
78% of the participants correctly indicated they analyze results by organizing, reviewing,
sharing, and monitoring the data of PBIS interventions. Twenty-two percent of the participants
responded incorrectly, thus demonstrating the need to know how to analyze results before
designing and implementing a comprehensive evaluation system that will monitor and ensure
100% compliance of PBIS interventions. These results showed that an improvement can be
made for PBIS committee members to learn and understand how to analyze PBIS intervention
results.
Table 15
Survey Results for PBIS Committee Members to Analyze Intervention Results

#  Procedural Knowledge Item  %  Count
   When I analyze results, I:
1  Organize and store the data.  1.3%  1
2  Organize, review, share, and monitor the data.*  77.9%  60
3  Review, share, and monitor the data.  15.6%  12
4  Share and monitor the data.  5.2%  4
   Total  100%  77
*Correct response
Interview findings. Based on 24 interviews, there was not clear agreement among the
participants on how to analyze the results of PBIS. Only 11 of the 24 participants provided the
expected response, a rate of 46% that fell short of the established 80% criterion for interview
findings. Participant 11 stated,
I was trained how to access results from the databases but I do not know how to analyze
them. If I wanted particular data, I could always ask the front office staff to gather it for
me; however, I do not analyze results to determine effectiveness.
Participant 22 indicated, “I know referrals go to the front office staff but I do not know what
happens to the data, I really do not know how I would evaluate results to determine if progress
was made.” These results demonstrated improvement can be made for PBIS committee
members to learn the process of obtaining results and examining them to make determinations.
Document analysis. Participants referenced websites, online accounts, and documents
which demonstrated an understanding on how to obtain PBIS evaluation results. For example, a
participant opened their laptop during the interview and demonstrated how they would obtain
particular results in order to make determinations. This step-by-step process demonstrated their
ability to access and analyze data results.
Summary. Survey results indicated 78% of the participants know how to analyze results
by organizing, reviewing, sharing, and monitoring the data of PBIS interventions. However,
interview findings demonstrated minimal knowledge of how to obtain PBIS evaluation results
and determine effectiveness, as indicated by only 46% of the participants. Overall, there was
not clear agreement among PBIS committee members at the established 80% criterion for
procedural knowledge.
Assumed procedural influence #3: PBIS committee members need to be able to
incorporate PBIS interventions into the schools.
Survey results. Although the criterion for this item was 80%, as shown in Table 16, only
75% of the participants correctly indicated that effective PBIS interventions could result in
revising intervention practices and/or monitoring implementation integrity, thus demonstrating
their correct procedural knowledge. Twenty-five percent of the participants responded
incorrectly, indicating participants need to learn and understand how to determine effectiveness
of PBIS interventions before designing and implementing a comprehensive evaluation system.
These results demonstrated improvement can be made for PBIS committee members in learning
and understanding how to incorporate PBIS interventions into the schools.
Table 16
Survey Results for Procedural Knowledge of PBIS Committee Members

#  Procedural Knowledge Item  %  Count
   Effective PBIS interventions could result in:
1  Revisions to intervention practices and/or monitoring implementation integrity.*  75.3%  58
2  Designing a new intervention.  3.9%  3
3  Implementing a new program and/or monitoring implementation integrity.  14.3%  11
4  Changing the structure of the program.  6.5%  5
   Total  100%  77
*Correct response
Interview findings. Based on 24 interviews, there was not clear agreement among the
participants on how to incorporate PBIS interventions into the schools. Only 14 of the 24
participants provided the expected response, a rate of 58% that fell short of the established 80% criterion
for interview findings. Participant 3 stated, “I use data results to inform staff of where we are as
a school and also to help facilitate discussions regarding best practices for PBIS. I honestly do
not do this as often as I would like.” Participant 23 indicated, “sometimes I look at the data to
make informed decisions on where to allocate resources for my school. I don’t do it as often as I
would like.” These results demonstrated improvement can be made for PBIS committee
members to use data results to influence the way PBIS interventions are incorporated into the
schools.
Document analysis. Documents were not provided nor analyzed.
Summary. Survey results indicated 75% of the participants were able to determine the
effectiveness of PBIS interventions. However, interview findings demonstrated minimal
knowledge of how to use data results to influence PBIS interventions in the schools, as
indicated by only 58% of the participants. Overall, there was not clear agreement among PBIS
committee members at the established 80% criterion for procedural knowledge.
Metacognitive Knowledge
Assumed metacognitive influence #1: PBIS committee members need to know how
to reflect on their own abilities to create their own comprehensive evaluation system.
Survey results. Although the criterion for this item was 80%, as shown in Table 17, only
75% of the participants correctly indicated they self-evaluate by regularly monitoring their
progress towards established goals, think through alternatives before making decisions, and
motivate themselves. Twenty-five percent of the participants responded incorrectly, thus
demonstrating participants’ lack of metacognitive knowledge. These results demonstrated that
improvement can be made for PBIS committee members to learn and understand how to self-
evaluate by reflecting on their own abilities.
Table 17
Survey Results for Metacognitive Knowledge of PBIS Committee Members to Self-evaluate by Reflecting on Their Own Abilities

#  Metacognitive Knowledge Item  %  Count
   I self-evaluate by:
1  Regularly monitoring my progress towards my goals.  14.3%  11
2  Thinking through alternatives before determining my answer.  9.1%  7
3  Motivating myself.  1.3%  1
4  All of the above.*  75.3%  58
   Total  100%  77
*Correct response
Interview findings. Based on 24 interviews, there was not clear agreement among the
participants on reflecting on their own abilities to create their own comprehensive evaluation
system. Only 12 of the 24 participants provided the expected response, a rate of 50% that fell
short of the established 80% criterion for interview findings. Participant 5 stated, “we reference the PBIS
binder we created for our school, that is how we are able to evaluate our own effectiveness.”
Participant 16 indicated, “I do not know, the assistant principal is typically the one who evaluates
our effectiveness.” These results demonstrated improvement can be made for PBIS committee
members to evaluate their own abilities instead of depending on other sources.
Document analysis. Documents were not provided nor analyzed.
Summary. Survey results indicated 75% of the participants were able to evaluate the
effectiveness of their own abilities. Similarly, interview findings demonstrated minimal
metacognitive knowledge of how to self-evaluate, as indicated by only 50% of the participants.
Overall, there was not clear agreement among PBIS committee members at the established
80% criterion for metacognitive knowledge.
Assumed metacognitive influence #2: PBIS committee members need to reflect on
and evaluate their own performance.
Survey results. As shown in Table 18, 88% of the participants correctly indicated that
they evaluate their performance by providing feedback, discussing what is working and what is
not working, and making adjustments, thus demonstrating their correct metacognitive
knowledge. This percentage exceeded the established criterion of 80%. However, this also
showed that an improvement can be made for PBIS committee members to learn and understand
how to evaluate the team’s performance as 12% noted otherwise. The assumed influence was
that the PBIS committee members do not know how to reflect on and evaluate their own
performance.
Table 18
Survey Results for PBIS Committee Members to Reflect on and Evaluate Their Own
Performance
# Metacognitive Knowledge Item % Count
We (as a group) evaluate our own performance by:
1 Providing feedback. 1.3% 1
2 Discussing what is working and what is not working. 10.4% 8
3 Making adjustments. 0% 0
4 All of the above.* 88.3% 68
Total 100% 77
*Correct response
Interview findings. Based on 24 interviews, there was not a clear agreement among the
participants reflecting on and evaluating their own performance. Only eight of the 24 (33%)
provided the expected response, which fell short of the established 80% criteria for interview
findings. Participant 14 stated, “I try to look at patterns and inconsistencies as a committee.
Then, I provide feedback to those who need extra support.” Participant 17 indicated, “I like to
have conversations with individuals; I have not addressed the committee just yet.” These results
demonstrated that improvement can be made for PBIS committee members to evaluate their
abilities as a committee rather than doing it in isolation.
Document analysis. Documents were neither provided nor analyzed.
Summary. Survey results indicated 88% of the participants evaluated their performance
as a team by providing feedback, discussing what is working and what is not working, and
making adjustments. On the other hand, interview results indicated only 33% of the participants
evaluated their performance as a team. These results indicated there was not a clear agreement
among PBIS committee members reaching the established 80% criteria for metacognitive
knowledge.
Results and Findings for Motivation Causes
Value
Assumed motivation influence #1: PBIS committee members need to value
designing and implementing a comprehensive evaluation system to monitor and ensure
compliance of PBIS interventions.
Survey results for designing. Although the criteria for this item was 80%, as shown in
Table 19, only 49% (20.8% + 28.6%, combining the two highest rankings) of the participants
correctly indicated that designing an evaluation plan for PBIS interventions was important to
them. Fifty-one percent of the participants responded incorrectly, thus indicating that designing
an evaluation plan for PBIS interventions was not important to them. These results
demonstrated that the participants did not place value on designing a comprehensive evaluation
system that will monitor and ensure 100% compliance of PBIS interventions.
Table 19
Survey Results for Motivation of PBIS Committee Members
# Motivation Item % Count
Put this sentence in order of what is most important to
you (1- most important to 4- least important): Designing
an evaluation plan for PBIS interventions.
1 Most important* 20.8% 16
2 28.6% 22
3 33.8% 26
4 Least important 16.8% 13
Total 100% 77
*Correct response
Survey results for implementing. Although the criteria for this item was 80%, as shown
in Table 20, only 56% of the participants correctly indicated that implementing an evaluation
plan for PBIS interventions was important to them. Forty-four percent of the participants
responded incorrectly, thus demonstrating a low level of motivation and value. These results
demonstrated that the participants did not place value on implementing a comprehensive
evaluation system to monitor and ensure 100% compliance of PBIS interventions.
Table 20
Survey Results for Motivation of PBIS Committee Members
# Motivation Item % Count
Put this sentence in order of what is most important to you
(1- most important to 4- least important): Implementing
an evaluation plan for PBIS interventions.
1 Most important* 22.1% 17
2 33.8% 26
3 26.0% 20
4 Least important 18.1% 14
Total 100% 77
*Correct response
Survey results for monitoring. Although the criteria for this item was 80%, as shown in
Table 21, only 61% of the participants correctly indicated that monitoring progress of PBIS
interventions was important to them. Thirty-nine percent of the participants responded
incorrectly, thus demonstrating a low level of motivation and value. These results demonstrated that the participants did
not place value on monitoring progress for a comprehensive evaluation system to ensure
compliance of PBIS interventions.
Table 21
Survey Results for Motivation of PBIS Committee Members
# Motivation Item % Count
Put this sentence in order of what is most important to
you (1- most important to 4- least important):
Monitoring progress of PBIS interventions
1 Most important* 37.7% 29
2 23.4% 18
3 24.7% 19
4 Least important 14.2% 11
Total 100% 77
*Correct response
Survey results for determining effectiveness. Although the criteria for this item was
80%, as shown in Table 22, only 60% of the participants correctly indicated that determining
effectiveness of PBIS interventions was important to them. Forty percent of the participants
responded incorrectly, thus demonstrating a low level of motivation and value. These results
demonstrated that the participants did not place value on determining effectiveness for a
comprehensive evaluation system to ensure 100% compliance of PBIS interventions.
Table 22
Survey Results for Motivation of PBIS Committee Members
# Motivation Item % Count
Put this sentence in order of what is most important to
you (1- most important to 4- least important):
Determining effectiveness of PBIS interventions.
1 Most important* 44.2% 34
2 15.6% 12
3 15.6% 12
4 Least important 24.6% 19
Total 100% 77
*Correct response
Interview findings. Based on 24 interviews, there was a clear agreement among the
participants valuing the design and implementation of a comprehensive evaluation system
that will monitor and ensure compliance of PBIS interventions. Twenty of the 24 participants
(83%) provided the expected response, meeting the established 80% criteria for interview findings.
Participant 13 stated,
there is value to all the expectations we place for our students and teachers, the
interventions we put in place to assist with challenging behaviors, and the amount of
work we devote to our school. It takes time but it certainly works.
Participant 18 indicated, “when we all speak the same language, set high standards for
students and staff, and there is follow-through, we see positive changes in school culture. PBIS
interventions work. Now we need to quantify that it does.” These results demonstrated
improvement can be made among PBIS committee members to place value in the design and
implementation of a comprehensive evaluation system.
Observation analysis. Some of the participants demonstrated they were invested in PBIS
by their choice of words and examples provided when speaking of the reasons why the
committee should design and implement an evaluation system. In addition, the time they took to
answer the interview questions demonstrated a level of engagement that was not observed when
answering other questions.
Summary. Survey results indicated there was not a clear agreement among PBIS
committee members reaching the established 80% criteria for value. On the other hand,
interview findings demonstrated 83% of the participants valued the design and implementation
of a comprehensive evaluation system. In addition, observations made during the interviews
indicated a clear agreement among the participants, who appeared invested and placed value in
evaluation. Overall, these results indicated there is a clear agreement among PBIS committee
members reaching the established 80% criteria for value.
Self-Efficacy
Assumed motivation influence #2: PBIS committee members need to have
confidence that they can create an evaluation system that will monitor and ensure
compliance of PBIS interventions.
Survey results for influencing. Although the criteria for this item was 80%, as shown in
Table 23, only 40% of the participants correctly indicated that they feel confident influencing
students, parents, and administrators to comply with PBIS interventions. Thirty-six percent of
the participants indicated feeling neutral and 23% of the participants indicated they did not feel
confident influencing students, parents, and administrators to comply with PBIS interventions,
thus indicating participants need to build their self-efficacy to create a comprehensive evaluation
system. These results demonstrated the current state of PBIS committee members’ self-efficacy
when influencing individuals to comply with PBIS interventions. Therefore, the survey results
validated this assumed influence.
Table 23
Survey Results for Motivation of PBIS Committee Members
# Motivation Item % Count
Rate your degree of confidence in doing the following by
indicating 1- strongly agree to 5- strongly disagree using
the scale below: I can influence students, parents, and
administrators to comply with PBIS interventions.
1 Strongly agree* 10.4% 8
2 29.9% 23
3 36.4% 28
4 18.2% 14
5 Strongly disagree 5.1% 4
Total 100% 77
*Correct response
Survey results for planning. Although the criteria for this item was 80%, as shown in
Table 24, only 78% of the participants correctly indicated that they felt confident planning and
meeting deadlines, thus demonstrating their level of motivation and self-efficacy. Eight percent
of the participants remained neutral and 14% of the participants responded incorrectly, thus
indicating some PBIS committee members did not have high levels of self-efficacy to create a
comprehensive evaluation system to ensure compliance of PBIS interventions. These results
showed that an improvement can be made for PBIS committee members to become
self-efficacious when planning and meeting deadlines to create an evaluation system.
Table 24
Survey Results for Planning and Meeting Deadlines
# Motivation Item % Count
Rate your degree of confidence in doing the following by
indicating 1- strongly agree to 5- strongly disagree using
the scale below: If I plan accordingly, I can meet
deadlines.
1 Strongly agree* 58.4% 45
2 19.5% 15
3 7.8% 6
4 6.5% 5
5 Strongly disagree 7.8% 6
Total 100% 77
*Correct response
Survey results for designing. As shown in Table 25, 27% of the participants correctly
indicated that they felt confident in their ability to design an evaluation plan for PBIS.
Forty-three percent of the participants indicated feeling neutral and 30% of the participants
indicated they did not feel confident designing an evaluation plan for PBIS, hence indicating participants
needed to work on building their confidence to create a comprehensive evaluation system. In
essence, these results demonstrated that PBIS committee members need to build their confidence
in order to feel comfortable designing a comprehensive evaluation plan that will monitor and
ensure compliance of PBIS interventions.
Table 25
Survey Results for Designing a Comprehensive Evaluation Plan
# Motivation Item % Count
Rate your degree of confidence in doing the following by
indicating 1- strongly agree to 5- strongly disagree using
the scale below: I am confident in my ability to design
an evaluation plan for PBIS.
1 Strongly agree* 7.8% 6
2 19.5% 15
3 42.9% 33
4 20.8% 16
5 Strongly disagree 9.0% 7
Total 100% 77
*Correct response
Survey results for monitoring. As shown in Table 26, 57% of the participants correctly
indicated that they felt confident monitoring PBIS interventions once they have been
implemented. Thirty-one percent of the participants indicated feeling neutral and 12% of the
participants indicated they did not feel confident monitoring an evaluation plan for PBIS, hence
indicating participants need to work on increasing their level of motivation and confidence. In
essence, these results demonstrated that PBIS committee members need to increase their
motivation and confidence levels to feel comfortable monitoring a comprehensive evaluation
plan that will ensure compliance of PBIS interventions.
Table 26
Survey Results for Monitoring an Evaluation Plan for PBIS
# Motivation Item % Count
Rate your degree of confidence in doing the following by
indicating 1- strongly agree to 5- strongly disagree using
the scale below: I am confident in my ability to monitor
PBIS interventions once they have been implemented.
1 Strongly agree* 15.6% 12
2 41.6% 32
3 31.2% 24
4 10.4% 8
5 Strongly disagree 1.2% 1
Total 100% 77
*Correct response
Survey results for assessing effectiveness. As shown in Table 27, 59% of the
participants correctly indicated that they felt confident in their abilities to assess effectiveness of
PBIS interventions. Thirty-three percent of the participants indicated feeling neutral and 9% of the
the participants indicated they did not feel confident assessing effectiveness of an evaluation plan
for PBIS, hence demonstrating their motivation and confidence levels. In essence, these results
demonstrated that PBIS committee members need to increase their motivation and confidence
levels to feel comfortable assessing effectiveness of a comprehensive evaluation plan.
Table 27
Survey Results for Assessing the Effectiveness of PBIS Interventions
# Motivation Item % Count
Rate your degree of confidence in doing the following by
indicating 1- strongly agree to 5- strongly disagree using
the scale below: I am confident in my ability to assess
effectiveness of PBIS interventions.
1 Strongly agree* 16.9% 13
2 41.6% 32
3 32.5% 25
4 6.5% 5
5 Strongly disagree 2.5% 2
Total 100% 77
*Correct response
Interview findings. Based on 24 interviews, there was not a clear agreement among the
participants feeling confident about their own abilities to create an evaluation system that would
monitor and ensure compliance of PBIS interventions. Only 11 of the 24 (46%) provided the
expected response, which fell short of the established 80% criteria for interview findings. Participant
11 indicated, “I have very little confidence because I have not learned how to monitor and ensure
compliance of our interventions. I am sure if I had some experience, I would feel better
answering this question.” Participant 19 stated, “I don’t have experience with this, therefore I
don't feel comfortable monitoring compliance on my own. However, I am comfortable being
part of this because I do see the importance of it at our school and district.” These results
demonstrated that PBIS committee members can improve their confidence levels to design,
monitor, and ensure compliance of PBIS interventions as a team.
Observation analysis. Some of the participants demonstrated their confidence to create
an evaluation system that will monitor and ensure compliance of PBIS interventions by their tone
of voice, their manners and behaviors, and examples they provided to describe current and prior
experiences. Some participants chose to engage in the conversation and provided additional
information and examples when answering interview questions.
Summary. Survey results indicated there was not a clear agreement among PBIS
committee members reaching the established 80% criteria for self-efficacy. Interview findings
reinforced what was reported in the surveys, and observations made it clear that the participants
were not self-efficacious and did not feel confident in their own abilities to create an evaluation
system. Overall, these results indicated there is not a clear agreement among PBIS committee
members reaching the established 80% criteria for self-efficacy.
Mood
Assumed motivation influence #3: PBIS committee members need to feel positive
about designing and implementing a comprehensive evaluation system.
Survey results for designing. Although the criteria for this item was 80%, as shown in
Table 28, only 33% of the participants correctly indicated that they felt positive about designing
a comprehensive evaluation system. Forty-two percent of the participants indicated feeling
neutral and 26% of the participants indicated they did not feel positive designing a
comprehensive evaluation system, thus indicating the participants need to increase their
motivation and mood by creating a positive outlook when designing an evaluation system.
These results demonstrated that improvement can be made for PBIS committee members in their
level of positivity when designing a comprehensive evaluation system.
Table 28
Survey Results for Designing a Comprehensive Evaluation System
# Motivation Item % Count
Rate your degree of positivity in doing the following by
indicating 1- strongly agree to 5- strongly disagree using
the scale below: I feel positive about designing a
comprehensive evaluation system.
1 Strongly agree* 9.1% 7
2 23.4% 18
3 41.6% 32
4 14.3% 11
5 Strongly disagree 11.6% 9
Total 100% 77
*Correct response
Survey results for implementing. Although the criteria for this item was 80%, as shown
in Table 29, only 36% of the participants correctly indicated that they felt positive when
implementing a comprehensive evaluation system. Forty-four percent of the participants
indicated feeling neutral and 19% of the participants indicated they did not feel positive
implementing a comprehensive evaluation system; therefore, indicating the participants need to
increase their motivation and mood by creating a positive outlook when implementing an
evaluation system. These results demonstrated that improvement can be made for PBIS
committee members in their level of positivity when implementing a comprehensive evaluation
system.
Table 29
Survey Results for Implementing a Comprehensive Evaluation System
# Motivation Item % Count
Rate your degree of positivity in doing the following by
indicating 1- strongly agree to 5- strongly disagree using
the scale below: I feel positive when implementing a
comprehensive evaluation system.
1 Strongly agree* 10.4% 8
2 26.0% 20
3 44.2% 34
4 15.6% 12
5 Strongly disagree 3.8% 3
Total 100% 77
*Correct response
Survey results for monitoring. Although the criteria for this item was 80%, as shown in
Table 30, only 40% of the participants correctly indicated that they felt positive when monitoring
a comprehensive evaluation system. Thirty-nine percent of the participants indicated feeling
neutral and 21% of the participants indicated they did not feel positive monitoring a
comprehensive evaluation system, thus indicating the participants needed to increase their
motivation and mood by creating a positive outlook when monitoring an evaluation system.
These results demonstrated that improvement can be made for PBIS committee members’ level
of positivity when monitoring a comprehensive evaluation system.
Table 30
Survey Results for Monitoring a Comprehensive Evaluation System
# Motivation Item % Count
Rate your degree of positivity in doing the following by
indicating 1- strongly agree to 5- strongly disagree using
the scale below: I feel positive when monitoring a
comprehensive evaluation system.
1 Strongly agree* 13% 10
2 27.3% 21
3 39.0% 30
4 15.6% 12
5 Strongly disagree 5.1% 4
Total 100% 77
*Correct response
Survey results for determining effectiveness. Although the criteria for this item was
80%, as shown in Table 31, only 48% of the participants correctly indicated that they felt
positive when determining effectiveness of PBIS interventions. Thirty-nine percent of the
participants indicated feeling neutral and 13% of the participants indicated they did not feel
positive determining effectiveness of PBIS interventions, thus indicating the participants
need to increase their motivation and mood by creating a positive outlook when determining
effectiveness of PBIS interventions. These results demonstrated that improvement can be made
for PBIS committee members’ level of positivity when determining effectiveness.
Table 31
Survey Results for Determining Effectiveness of PBIS Interventions
# Motivation Item % Count
Rate your degree of positivity in doing the following by
indicating 1- strongly agree to 5- strongly disagree using
the scale below: I feel positive when determining
effectiveness of PBIS interventions.
1 Strongly agree* 10.4% 8
2 37.7% 29
3 39.0% 30
4 10.4% 8
5 Strongly disagree 2.5% 2
Total 100% 77
*Correct response
Interview findings. Based on 24 interviews, there was not a clear agreement among the
participants feeling positive about designing and implementing a comprehensive evaluation
system that will monitor and ensure compliance of PBIS interventions. Only 16 of the 24 (67%)
provided the expected response, which fell short of the established 80% criteria for interview
findings. Participant 9 indicated, “I am not sure how I feel about implementing and monitoring
PBIS interventions; I think that is the responsibility of the administrators. There are so many
other things we [teachers] need to do.” Participant 16 stated, “I honestly don’t believe in PBIS
so if you ask me, I don’t feel a need to design, implement, monitor, or determine effectiveness of
PBIS interventions.” These results demonstrated that PBIS committee members can improve
their positivity regarding the design, implementation, monitoring, and effectiveness of PBIS
interventions as a team.
Document analysis. Documents were neither provided nor analyzed.
Observation analysis. Some of the participants demonstrated their positivity and
willingness to design, implement, monitor, and determine effectiveness of PBIS interventions by
their tone of voice, their manners and behaviors, and examples they provided to describe
experiences. Some participants chose to engage in the conversation and provided additional
information and examples when answering interview questions.
Summary. Survey results indicated there was not a clear agreement among PBIS
committee members reaching the established 80% criteria for mood. Interview findings
reinforced what was reported in the surveys, and observations made it clear that the participants
do not have high levels of positivity regarding the design, implementation, monitoring, and
effectiveness of PBIS interventions. The participants’ mood to engage in this task was not
positive. Overall, these results indicated there is not a clear agreement among PBIS committee
members reaching the established 80% criteria for mood.
Results and Findings for Organization Causes
Organization/Culture/Context
Resources. Assumed organization influence #1: PBIS committee members need
time to create an effective comprehensive evaluation system.
Survey results. Although the criteria for this item was 80%, as shown in Table 32, only
42% of the participants correctly indicated that their school site provides planning time to design,
implement, and monitor compliance of PBIS interventions. Thirty-six percent of the participants
reported they neither agreed nor disagreed with the statement and 22% of the participants
indicated they disagreed or strongly disagreed, thus indicating the participants need more time to design,
implement, and monitor compliance of PBIS interventions. These results demonstrated that
PBIS committee members need time to create an effective comprehensive evaluation system.
Table 32
Survey Results for Organization of PBIS Committee Members
# Organization Item % Count
Please indicate the extent which you agree with the
statement (1- strongly agree to 5- strongly disagree):My
school site provides planning time to design, implement,
and monitor compliance of PBIS interventions.
1 Strongly agree* 15.6% 12
2 26% 20
3 36.4% 28
4 13.0% 10
5 Strongly disagree 9.0% 7
Total 100% 77
*Correct response
Interview findings. Based on 24 interviews, there was not a clear agreement among the
participants regarding whether there is enough time to create an effective comprehensive
evaluation system. Only 12 of the 24 (50%) provided the expected response, which fell short of
the established 80% criteria for interview findings. Participant 17 stated, “we try to meet once per
month” and Participant 21 indicated, “we meet as a committee six times per year.” However,
Participant 13 stated, “we meet on a weekly basis for one hour.” When asked how they utilized
their meeting time, Participant 13 elaborated by indicating, “our meetings are purposeful and
structured; we analyze data, identify problems, needs, solutions, and school wide approaches.
We maximize our time when we meet.” On the other hand, Participant 17 stated, “we meet to
plan events.” These results demonstrated improvement can be made for PBIS committee
members to prioritize time and strategically plan to create an effective comprehensive evaluation
system.
Document analysis. Documents were neither provided nor analyzed.
Observation analysis. Some of the participants demonstrated their satisfaction and/or
frustrations regarding resources [time] using their tone of voice, body language, and examples
provided to describe experiences. Some participants chose to engage in the conversation and
provided additional information and examples when answering the interview question.
Summary. Survey results indicated there was not a clear agreement among PBIS
committee members reaching the established 80% criteria for resources (i.e., time). Interview
findings reinforced what was reported in the surveys, and observations made it clear that the
participants did not have time to create a comprehensive evaluation system. Overall, these
results indicated there is not a clear agreement among PBIS committee members reaching the
established 80% criteria for resources.
Assumed organization influence #2: PBIS committee members need individuals to
commit to design, implement, and monitor an effective comprehensive evaluation system.
Survey results. Although the criteria for this item was 80%, as shown in Table 33, only
69% of the participants correctly indicated that the administrative team at their school sites
support their efforts to design, implement, and monitor an effective comprehensive evaluation
system for PBIS. Sixteen percent of the participants remained neutral and 16% of the
participants responded incorrectly, thus indicating that, for some, the organization was not supportive of
their efforts. These results demonstrated that improvement can be made for PBIS committee
members to commit to design, implement, and monitor an effective comprehensive evaluation
system.
Table 33
Survey Results for Organization of PBIS Committee Members
# Organization Item % Count
Please indicate the extent which you agree with the
statement (1- strongly agree to 5- strongly disagree): The
administrative team at my school site supports my
efforts.
1 Strongly agree* 50.6% 39
2 18.2% 14
3 15.6% 12
4 11.7% 9
5 Strongly disagree 3.9% 3
Total 100% 77
*Correct response
Interview findings. Based on 24 interviews, there was not a clear agreement among the
participants with regard to their level of commitment to design, implement, and monitor an
effective comprehensive evaluation system. Only 11 of the 24 participants (46%) provided the
expected response, which fell short of the established 80% criteria for interview findings.
Participant 16 stated, “we do not have enough time to meet or plan, it is very limited because
[we] are focusing on other initiatives that are more important right now.” Participant 23 indicated, “I
make sure to allocate time, explore new ideas, and ways we will be delivering PBIS
interventions across our site.” These results demonstrated improvement can be made for PBIS
committee members to prioritize and commit to create an effective comprehensive evaluation
system.
Document analysis. Documents were neither provided nor analyzed.
Observation analysis. Some of the participants demonstrated their satisfaction and/or
frustrations regarding resources [commitment] using their tone of voice, body language, and
examples provided to describe experiences. Some participants chose to engage in the
conversation and provided additional information and examples when answering the interview
question.
Summary. Survey results indicated there was not a clear agreement among PBIS
committee members reaching the established 80% criteria for resources (i.e., commitment).
Interview findings reinforced what was reported in the surveys, and observations made it clear
that the participants are not committed to creating a comprehensive evaluation system. Overall,
these results indicated there is not a clear agreement among PBIS committee members reaching
the established 80% criteria for resources.
Policies, Processes, and Procedures
Assumed organization influence #3: PBIS committee members need to have policies
that align with the goal of the school district.
Survey results. Although the criteria for this item was 80%, as shown in Table 34, only
56% of the participants correctly indicated that the district policies align with PBIS implemented
interventions. Twenty five percent of the participants remained neutral and 19% of the
participants responded incorrectly, thus demonstrating the policies do not align with the goal of
the SEESD. These results demonstrated that improvement can be made for PBIS committee
members to further align the district’s policies with the implemented PBIS interventions.
Table 34
Survey Results for Organization of PBIS Committee Members
# Organization Item % Count
Please indicate the extent to which you agree with the
statement (1- strongly agree to 5- strongly disagree): The
district policies align with our PBIS implemented
interventions.
1 Strongly agree* 22.1% 17
2 33.8% 26
3 24.7% 19
4 14.3% 11
5 Strongly disagree 5.1% 4
Total 100% 77
*Correct response
Interview findings. Based on 24 interviews, there was not a clear agreement among the
participants having policies that align with the goal of the school district. Only six of the 24
participants (25%) provided the expected response, which fell short of the established 80%
criteria for interview findings. Participant 3 stated, “in my opinion, the district’s policies do align with
SEESD’s goal. It is up to individuals to read and know the policies that are in place.”
Participant 24 indicated, “there are policies that align with the goal of creating a safe place for all
students and families, so everyone is engaged in school and ready to learn.” These results
demonstrated that improvement can be made for PBIS committee members to make a conscious
effort to align policies with the goal of the SEESD.
Document analysis. Some participants had the district's goals posted on the wall but
only a few referenced them when discussing district policies.
Summary. Survey results indicated there is not a clear agreement among PBIS
committee members reaching the established 80% criteria for Policies, Processes, and
Procedures. Interview findings reinforced what was reported in the surveys, and the document
analysis made it clear that the participants did not have policies that align with the goal of SEESD.
Overall, these results indicated there is not a clear agreement among PBIS committee members
reaching the established 80% criteria for policies.
Cultural Settings
Assumed organization influence #4: PBIS committee members need to be part of a
culture that supports PBIS interventions that align with the mission of investing in optimal
learning environments to enhance safety and create positive school climates.
Survey results. Although the criterion for this item was 80%, as shown in Table 35, only 62% of the participants indicated that the district’s goal of investing in optimal learning environments that enhance safety and create positive school climates aligns with PBIS interventions at their school sites. Sixteen percent of the participants remained neutral and 22% of the participants responded incorrectly, demonstrating that the organization is not supportive of school climate. These results showed that improvement can be made for PBIS committee members to enhance safety and promote positive school climates at their school sites.
Table 35
Survey Results for District Goal Alignment with PBIS Interventions
# Organization Item % Count
Please indicate the extent to which you agree with the
statement (1- strongly agree to 5- strongly disagree): The
district's goal of investing in optimal learning
environments that enhance safety and create positive
school climates aligns with PBIS interventions at my
school site.
1 Strongly agree* 29.9% 23
2 32.5% 25
3 15.6% 12
4 14.3% 11
5 Strongly disagree 7.7% 6
Total 100% 77
*Correct response
Interview findings. Based on 24 interviews, there was not clear agreement that the participants are part of a culture that supports PBIS interventions aligned with the mission of investing in optimal learning environments to enhance safety and create positive school climates. Only 15 of the 24 participants provided the expected response, meeting only 63% of the expected 80% criterion for interview findings. Participant 9 indicated, “if we want to implement PBIS we can, if not we don’t have to. They have some slips you can give to students for good behavior but the students don’t care.” Participant 12 stated, “we have worked hard to embed PBIS to our school culture . . . the messaging, the way we speak to the students, the expectations we set for our students, and the way our school looks.” These results demonstrated that improvement can be made for PBIS committee members to create and demonstrate a culture of PBIS.
Document analysis. Documents were not provided nor analyzed.
Observation analysis. Some of the participants demonstrated their support of PBIS interventions by referencing agenda items, notes, and behavior expectations. Some participants had posters inside their classrooms or talked about the ones in the hallways serving as a reminder to everyone.
Summary. Survey results indicated that the district’s goal does not align with PBIS interventions at each school site. Interview findings reinforced what was reported in the surveys; however, observations provided a different point of view, as it was evident there was a culture of PBIS at each school. Overall, survey and interview outcomes indicated there is not clear agreement among PBIS committee members reaching the established 80% criterion for culture.
Assumed organization influence #5: PBIS committee members need to be part of a
culture that values monitoring and accountability, ensuring that PBIS interventions are
being implemented.
Survey results for district value and recognition. Although the criterion for this item was 80%, as shown in Table 36, only 53% of the participants correctly indicated that the district values and recognizes schools that apply PBIS interventions. Twenty-two percent of the participants remained neutral and 25% of the participants responded incorrectly, demonstrating that the organization neither supports nor values school culture. These results demonstrated that improvement can be made for PBIS committee members to instill a culture that values and recognizes PBIS interventions.
Table 36
Survey Results to Instill a Culture that Values and Recognizes PBIS Interventions
# Organization Item % Count
Please indicate the extent to which you agree with the
statement (1- strongly agree to 5- strongly disagree): The
district values and recognizes schools that apply PBIS
interventions.
1 Strongly agree* 24.7% 19
2 28.6% 22
3 22.1% 17
4 16.9% 13
5 Strongly disagree 7.7% 6
Total 100% 77
*Correct response
Survey results for school site value and recognition. Although the criterion for this item was 80%, as shown in Table 37, only 68% of the participants indicated that their school sites value and recognize those who apply PBIS interventions. Eighteen percent of the participants remained neutral and 14% of the participants responded incorrectly, indicating that the schools neither support nor value school culture. These results demonstrated that improvement can be made for PBIS committee members to instill a culture that values and recognizes PBIS interventions implemented at each school.
Table 37
Survey Results for School Site Value and Recognition of PBIS Interventions
# Organization Item % Count
Please indicate the extent to which you agree with the
statement (1- strongly agree to 5- strongly disagree): My
school site values and recognizes those who apply PBIS
interventions.
1 Strongly agree* 31.2% 24
2 36.4% 28
3 18.2% 14
4 7.8% 6
5 Strongly disagree 6.4% 5
Total 100% 77
*Correct response
Survey results for district accountability. Although the criterion for this item was 80%, as shown in Table 38, only 35% of the participants agreed that the district monitors and holds schools accountable for the implementation of PBIS interventions. Forty-three percent of the participants remained neutral and 22% of the participants disagreed with the statement, indicating the district needs to monitor and hold schools accountable for implementing PBIS interventions at their respective school sites. These results demonstrated where the district stands regarding its value of monitoring and accountability for PBIS interventions implemented at school sites.
Table 38
Survey Results for District Accountability for PBIS Implementation
# Organization Item % Count
Please indicate the extent to which you agree with the
statement (1- strongly agree to 5- strongly disagree): The
district monitors and holds schools accountable for the
implementation of PBIS interventions.
1 Strongly agree* 14.3% 11
2 20.8% 16
3 42.9% 33
4 16.9% 13
5 Strongly disagree 5.1% 4
Total 100% 77
*Correct response
Survey results for school site accountability. Although the criterion for this item was 80%, as shown in Table 39, only 59% of the participants agreed that their school site monitors and holds the entire school accountable for the implementation of PBIS interventions. Twenty-six percent of the participants remained neutral and 16% of the participants disagreed with the statement, demonstrating the schools’ culture of monitoring and accountability. These results showed that improvement can be made for PBIS committee members to create an environment where individuals value a culture of monitoring and accountability.
Table 39
Survey Results for School Site Accountability for PBIS Implementation
# Organization Item % Count
Please indicate the extent to which you agree with the
statement (1- strongly agree to 5- strongly disagree): My
school site monitors and holds the entire school
accountable for the implementation of PBIS
interventions.
1 Strongly agree* 16.9% 13
2 41.6% 32
3 26.0% 20
4 10.4% 8
5 Strongly disagree 5.1% 4
Total 100% 77
*Correct response
Interview findings. Based on 24 interviews, there was not clear agreement that the participants are part of a culture that values monitoring and accountability of PBIS interventions. Only 9 of the 24 participants provided the expected response, meeting only 38% of the expected 80% criterion for interview findings. Participant 19 stated, “if I could I would start all over and begin by laying a strong foundation for PBIS. I would have high expectations and hold every school accountable…” Participant 5 indicated, “there are eight different systems because no one created a system. There needs to be strong system in place before you expect to hold schools accountable for the implementation of PBIS.” These results demonstrated that improvement can be made for PBIS committee members to create a culture that values monitoring and accountability.
Document analysis. Documents were not provided nor analyzed.
Observation analysis. Some of the participants referenced binders, posters that included
acronyms, behavior expectations for all students, and positive messages on the walls.
Summary. Survey results indicated that the district does not ensure PBIS interventions are being implemented at each school site. Interview findings reinforced what was reported in the surveys; however, observations provided a different point of view, as it was evident there was a culture of PBIS at each school. Overall, survey and interview outcomes indicated there is not clear agreement among PBIS committee members reaching the established 80% criterion for culture.
CHAPTER FIVE: RECOMMENDATIONS AND EVALUATION
Purpose of the Study and Questions
The overall purpose of this study was to identify and close the performance gap relative
to the overall organizational goal of designing and implementing an evaluation system that will
monitor and ensure 100% compliance of PBIS interventions. This study was driven by two
inquiry questions:
1. What are the knowledge, motivation, and organizational needs necessary for the
SEESD’s PBIS committee members to achieve their goal of designing and implementing
a comprehensive evaluation system that will monitor and ensure 100% compliance of
PBIS interventions addressing its mission?
2. What are the recommended knowledge, motivation, and organizational solutions to those
needs?
Recommendations to Address Knowledge, Motivation,
and Organization Influences
Clark and Estes’ (2008) Gap Analysis Process Model was utilized to identify the goal and the gap between the desired outcome and current performance. Interviews, document analysis, and observations helped identify potential causes of the gap.
research studies and a review of literature guided the construction of a survey and set of
questions to test potential causes of the gap. The gaps were identified through staff surveys and
interviews with PBIS committee members. This chapter will address Inquiry Question Two.
The purpose of this chapter is to discuss potential solutions that have been designed to
address the highest priority validated causes of SEESD’s performance gap. This chapter will
also provide recommendations for implementing these solutions.
Knowledge Recommendations
Introduction. The knowledge influences in Table 40 include all validated assumed knowledge influences and their priority. The knowledge influences were validated based on the most frequently mentioned factual, conceptual, procedural, and metacognitive knowledge influences for achieving the established goal during semi-structured interviews, document analysis, and observations, supported by surveys and a review of the literature. The conceptual framework for this study was Clark and Estes’ (2008) gap analysis. The knowledge influences included the conceptual dimension that addresses the what, the procedural dimension that addresses the how, and the metacognitive dimension that addresses the self (Krathwohl, 2002). As indicated in Table 40, these influences have a high priority for achieving the performance goal. Table 40 also shows the recommendations for these influences based on theoretical principles.
Table 40
Summary of Knowledge Influences and Recommendations

Factual
Influence: PBIS committee members need to know the established goal to evaluate compliance of PBIS interventions. Priority: Low.
Influence: PBIS committee members need to know the components of PBIS. Priority: Low.
Influence: PBIS committee members need to know what comprises an effective evaluation system that can monitor and ensure compliance of PBIS interventions. Priority: High.
  Principle: When individuals use their prior knowledge they increase their learning (Schraw & McCrudden, 2006).
  Recommendation: PBIS committee members will need examples or a clear idea of what an effective evaluation system entails. Remind PBIS committee members about their prior knowledge on evaluation and connect it with new knowledge to identify effective evaluation practices.

Conceptual
Influence: PBIS committee members need to know various categories of PBIS to measure improvements and/or areas of growth. Priority: High.
  Principle: How individuals organize knowledge influences how they learn and how they apply what they know (Schraw & McCrudden, 2006).
  Recommendation: PBIS committee members will organize PBIS assessment tools into categories by creating a list that will include brief descriptions of what each tool measures, website links to access each assessment tool, and an area to include login information.
Influence: PBIS committee members need to know the three tiers of PBIS supports. Priority: Low.
Influence: PBIS committee members need to know the system of PBIS. Priority: Low.

Procedural
Influence: PBIS committee members need to know how to create an evaluation system. Priority: High.
  Principles: Guidance, modeling, and feedback are strategies to improve self-efficacy, learning, and performance (Denler, Wolters, & Benzon, 2009). Continued practice promotes automaticity and takes less capacity in working memory (Schraw & McCrudden, 2006). To develop mastery, individuals must acquire component skills, practice integrating them, and know when to apply what they have learned (Schraw & McCrudden, 2006).
  Recommendation: PBIS committee members will attend trainings at the district office where examples of what an effective evaluation system looks like will be discussed. Through demonstration, practice, and feedback, PBIS committee members will collaboratively determine items to include in the evaluation system.
Influence: PBIS committee members need to know how to analyze the results of the PBIS evaluation system. Priority: High.
  Principles: Effective training must be detailed and sequenced as performed on the job (Clark & Estes, 2008). To develop mastery, individuals must acquire component skills, practice integrating them, and know when to apply what they have learned (Schraw & McCrudden, 2006).
  Recommendation: PBIS committee members will be provided with demonstrations and feedback as they practice correcting performance mistakes.
Influence: PBIS committee members need to be able to incorporate PBIS interventions into the schools. Priority: Low.

Metacognitive
Influence: PBIS committee members need to know how to reflect on their own abilities to create their own comprehensive evaluation system. Priority: Low.
Influence: PBIS committee members need to reflect on and evaluate their own performance. Priority: High.
  Principles: The use of metacognitive strategies facilitates learning (Baker, 2006). As individuals increase their metacognitive awareness they become self-regulated learners and increase their ability to determine what strategies work for them and when it is appropriate to use these strategies (Mayer, 2011).
  Recommendation: PBIS committee members will reflect on and evaluate their own performance by journaling on a weekly basis. Journals will be discussed in one-on-one meetings with the PBIS coordinator.
Factual knowledge solutions. PBIS committee members need to know what comprises an effective evaluation system that can monitor and ensure compliance of PBIS interventions. Prior knowledge plays an essential role in the learning process by organizing and connecting new knowledge to what has been stored in long-term memory, thereby freeing valuable space in working memory (Ambrose, Bridges, DiPietro, Lovett, & Newman, 2010; Mayer, 2011; Rueda, 2011). Schraw and McCrudden (2006) discussed information processing theory and suggested that when individuals use their prior knowledge they increase their learning. People often have problems recognizing when it is appropriate to use past experience to handle a new challenge (Clark & Estes, 2008); however, the more individuals continue to reference prior knowledge, the more they develop mastery. Schraw and McCrudden indicated that once individuals develop mastery, they begin to integrate knowledge and know when to apply what they have learned. In this case, it is important that PBIS committee members use examples of what an effective evaluation system entails. Therefore, it is recommended that PBIS committee members use their prior knowledge of evaluation and connect it with new knowledge to identify effective evaluation practices.
Conceptual knowledge solutions. PBIS committee members need to know various categories of PBIS to measure improvements and/or areas of growth. Krathwohl (2002) stated that conceptual knowledge allows individuals to use the interrelationships of their foundational knowledge. As indicated by Schraw and McCrudden (2006), information learned meaningfully and connected with prior knowledge is stored and remembered more accurately because it is elaborated with prior learning. Mayer (2011) emphasized that learning is achieved by organizing and rehearsing modeled behaviors, then enacting them overtly. Therefore, it is recommended that PBIS committee members organize PBIS assessment tools into categories by creating a list that includes brief descriptions of what each tool measures, website links to access each assessment tool, and an area to include login information. This will provide the “how to” information without requiring guided practice, decreasing cognitive load for individuals and making it easier to transfer the knowledge to the workplace (Clark & Estes, 2008; Grossman & Salas, 2011).
Procedural knowledge solutions. PBIS committee members need to know how to create an evaluation system. Krathwohl (2002) suggested that procedural knowledge provides individuals with the ability to decide how to use their skills and determine the appropriate time to implement them. Strategies to improve learning and performance include guidance, modeling, and feedback (Denler et al., 2009). Therefore, it is recommended that PBIS committee members attend trainings at the district office where examples of what an effective evaluation system looks like will be discussed. Through practice, demonstration, and feedback, PBIS committee members will collaboratively determine items to include in the evaluation system.
Additionally, PBIS committee members need to know how to implement an evaluation system. Clark and Estes (2008) suggested that training individuals with the “how to” knowledge and providing guided practice and feedback helps individuals achieve their goal. Continued practice also promotes automaticity and takes less capacity in working memory (Schraw & McCrudden, 2006). To develop mastery, individuals must acquire component skills, practice integrating them, and know when to apply what they have learned (Schraw & McCrudden, 2006). Therefore, it is recommended that PBIS committee members be guided and provided with opportunities to practice their skills. In addition, it is recommended that they be provided with feedback throughout their journey of developing an effective evaluation system.
Similarly, PBIS committee members need to know how to analyze the results of the PBIS
evaluation system. Clark and Estes (2008) suggested that effective training must be detailed and
sequenced as performed on the job. To develop mastery, individuals must acquire component
skills, practice integrating them, and know when to apply what they have learned (Schraw &
McCrudden, 2006). Therefore, it is recommended for PBIS committee members to be provided
with demonstrations and feedback as they practice correcting performance mistakes.
Metacognitive knowledge solutions. PBIS committee members need to reflect on and evaluate their own performance. Metacognitive knowledge is important because it allows individuals to assess their level of cognitive awareness (Krathwohl, 2002). Metacognition provides opportunities for learners to engage in guided self-monitoring and self-assessment, hence being able to debrief the thinking process upon completion of a task (Baker, 2006). Correspondingly, as individuals increase their metacognitive awareness they become self-regulated learners and increase their ability to determine what strategies work for them and when it is appropriate to use these strategies (Mayer, 2011). Therefore, it is recommended that PBIS committee members reflect on and evaluate their own performance by journaling on a weekly basis. This exercise will increase self-awareness and facilitate the transfer of knowledge from one context to another by discussing the journals during one-on-one meetings with the PBIS coordinator.
Motivation Recommendations
Introduction. The motivation influences in Table 41 include all assumed motivation influences and their priority. The motivation influences were validated based on the most frequently mentioned motivational influences for achieving the established goal during semi-structured interviews, document analysis, and observations, supported by surveys and a review of the literature. The motivation influences include expectancy, value, and self-efficacy. Clark and Estes (2008) presented three facets of motivated performance: active choice, persistence, and mental effort. Active choice takes place when an individual actively decides to pursue a goal, not merely intends to start. Once the individual has chosen a particular goal, problems with persistence can threaten the individual’s determination to work toward the established goal. Lastly, choosing a goal and persisting to accomplish it must be combined with mental effort. In addition, the confidence the individual has toward reaching the desired goal is an essential component of how much value is applied to reaching the goal.
With this information in mind, the PBIS committee members’ motivation to value designing and implementing a comprehensive evaluation system, and their belief that they can do so, were evaluated to determine gaps in their motivation. As indicated in Table 41, these influences have a high priority for achieving the performance goal. Table 41 also shows the recommendations for these influences based on theoretical principles.
Table 41
Summary of Motivation Influences and Recommendations

Value
Influence: PBIS committee members need to value designing and implementing a comprehensive evaluation system to monitor and ensure compliance of PBIS interventions. Priority: High.
  Principles: Motivation increases when people see the value of the task and, due to prior successes, have confidence in their ability (Eccles, 2006). Learning and motivation are enhanced when individuals attribute success or failure to effort rather than ability (Anderman & Anderman, 2010).
  Recommendation: PBIS committee members will see the successes of school districts that have successfully implemented a comprehensive evaluation system. This will help them see the value of the task.

Self-Efficacy
Influence: PBIS committee members need to have confidence that they can create an evaluation system that will monitor and ensure compliance of PBIS interventions. Priority: High.
  Principles: For a team to function effectively, it is critical for the individuals to have high efficacy (Bandura, 2000). When individuals are able to successfully master a task, self-efficacy increases (Pajares, 2006).
  Recommendation: PBIS committee members will have monthly meetings with the PBIS coordinator and receive corrective feedback and positive encouragement on current status.

Mood
Influence: PBIS committee members need to feel positive about designing and implementing a comprehensive evaluation system. Priority: Low.
Expectancy value. PBIS committee members need to value designing and implementing a comprehensive evaluation system to monitor and ensure compliance of PBIS interventions. Eccles (2006) indicated that motivation increases when people see the value of the task and, due to prior successes, have confidence in their ability. Anderman and Anderman (2010) emphasized that learning and motivation are enhanced when individuals attribute success or failure to effort rather than ability. Therefore, it is recommended that PBIS committee members see the successes of school districts that have successfully implemented a comprehensive evaluation system. This will help them see the value of the task and build the committee’s confidence.
Self-efficacy. PBIS committee members need to have confidence that they are capable of creating an evaluation system that will monitor and ensure compliance of PBIS interventions. Bandura (2000) suggested that for a team to function effectively, it is critical for the individuals to have high efficacy. High self-efficacy can positively influence motivation (Pajares, 2006). Additionally, when individuals are able to successfully master a task, self-efficacy increases (Pajares, 2006). Feedback and modeling also increase self-efficacy and enhance motivation (Pajares, 2006). Therefore, PBIS committee members will have individual monthly meetings with the PBIS coordinator and receive corrective feedback and positive encouragement. As time progresses, the PBIS coordinator will gradually increase the challenge in order to improve self-efficacy.
Organization Recommendations
Introduction. The organizational influences in Table 42 include all assumed organizational influences and their priority. The organizational influences were validated based on the most frequently mentioned organizational influences for achieving the established goal during semi-structured interviews, document analysis, and observations, supported by surveys and a review of the literature. Even when stakeholders have the knowledge and skills and are motivated, Clark and Estes (2008) suggested that a lack of effective organizational resources, policies and procedures, and cultural settings and models may prevent individuals from achieving their performance goals. As stated by Rueda (2011), cultural settings are the visible characteristics of the daily workings of an organization, while cultural models are often invisible, shared mental representations of the organization’s structures, values, practices, and policies. The cultural setting influences include the organization’s ability to provide time and stakeholder capacity to create an evaluation system. The cultural model influences include the organization’s ability to provide an environment that supports change and values PBIS interventions. As indicated in Table 42, these influences have a high priority for achieving the performance goal. Table 42 also shows the recommendations for these influences based on theoretical principles.
Table 42
Summary of Organization Influences and Recommendations

Cultural Settings
Influence: PBIS committee members need time to create an effective comprehensive evaluation system. Priority: High.
  Principle: Organizational effectiveness increases when leaders ensure that employees have the resources needed to achieve the organization’s goals (Clark & Estes, 2008).
  Recommendation: PBIS committee members will be provided opportunities to meet on a weekly and monthly basis. This will allow staff and administrators the opportunity to communicate their plans, progress, and support for one another.
Influence: PBIS committee members need individuals to commit to design, implement, and monitor an effective comprehensive evaluation system. Priority: High.
  Principle: Organizational effectiveness decreases if the workload increases more than 10% when adopting change (Sirkin, Keenan, & Jackson, 2005).
  Recommendation: PBIS committee members will review the current demands, workload, and resources to collaboratively develop a plan to redesign the work process.

Policies and Procedures
Influence: PBIS committee members need to have policies that align with the goal of the SEESD. Priority: Low.

Cultural Models
Influence: PBIS committee members need to be part of a culture that supports PBIS interventions that align with the mission of investing in optimal learning environments that enhance safety and create positive school climates. Priority: High.
  Principle: Organizational performance increases when the organization has a vision, goals, and ways to measure progress (Clark & Estes, 2008).
  Recommendation: PBIS committee members will be asked to provide monthly updates on assessment outcomes to track data and create a culture of accountability. The outcome data will then help the PBIS committee members identify areas of growth that connect the vision and goals.
Influence: PBIS committee members need to be part of a culture that values monitoring and accountability, ensuring that PBIS interventions are being implemented. Priority: High.
  Principle: Organizational performance increases and trust is promoted when individuals and leaders communicate openly and constantly about plans and progress (Clark & Estes, 2008).
  Recommendation: PBIS committee members will be asked to provide updates during scheduled meetings at the district office to allow staff and administration the opportunity to openly communicate their plans, progress, and support for one another to create a culture of accountability.
Cultural settings. PBIS committee members need time to create an effective comprehensive evaluation system. In addition, PBIS committee members need individuals to commit to designing, implementing, and monitoring an effective comprehensive evaluation system. When a change is implemented and the workload increases by more than 10%, organizational effectiveness decreases (Sirkin et al., 2005). Comparatively, Clark and Estes (2008) indicated that organizational effectiveness increases when leaders ensure employees have the resources needed to achieve the organization’s established goals. Therefore, PBIS committee members will review the current work demands and resources to collaboratively develop a plan to redesign the work process.
After a thorough review of professional development days, leadership team meetings, and school site team meetings, the PBIS coordinator will consult with the Superintendent regarding existing opportunities for PBIS committee members to meet on a weekly and monthly basis.
Consulting with the Superintendent will prevent potential conflicts with the collective bargaining agreement that governs the use of time for union-based employees. For instance, one way to provide release time for teachers to attend PBIS committee meetings during the school day without using substitutes would be to utilize teachers within the school to cover classes while PBIS committee members meet and plan. This will also alleviate potential time pressure and overload for individuals working on various initiatives and projects. For example, collaboration among departments will allow each department to share ideas and information before scheduled PBIS committee meetings, thus reducing time barriers. Essentially, the support of the Superintendent will allow staff and administrators the opportunity to communicate their plans, progress, and support for one another. In addition, it will allow for a redesigned work process that includes more time to focus on the creation of a comprehensive evaluation system.
Cultural models. PBIS committee members need to be part of a culture that supports
PBIS interventions that align with the mission of investing in optimal learning environments that
enhance safety and create positive school climates. Clark and Estes (2008) suggested
organizational performance increases when the organization has a vision, goals, and ways to
measure progress. Accordingly, the support of the Superintendent will be essential in motivating the
development of a comprehensive evaluation system. When the Superintendent begins to
recognize individual schools for their efforts based on data gathered by the evaluation system,
individuals will begin to value and believe in their work. Therefore, PBIS committee members
will be asked to provide monthly updates on assessment outcomes to track data and create a
culture of accountability. The outcome data will then help the PBIS committee members
identify areas of growth that connect the vision and goals of the organization.
In addition, PBIS committee members need to be part of a culture that values monitoring
and accountability, hence ensuring PBIS interventions are implemented. Clark and Estes (2008)
indicated organizational performance increases and trust is promoted when individuals and
leaders communicate openly and constantly about plans and progress. Therefore, PBIS
committee members will be asked to provide updates during scheduled meetings at the district
office to allow staff and administration the opportunity to openly communicate their plans,
progress, and support for one another to create a culture of accountability.
Integrated Implementation and Evaluation Plan
Implementation and Evaluation Framework
The model that informed this implementation and evaluation plan was the New World
Kirkpatrick Model (Kirkpatrick & Kirkpatrick, 2016) based on the original Kirkpatrick Four
Level Model of Evaluation (Kirkpatrick & Kirkpatrick, 2006). This model suggested that
evaluation plans start with the goals of the organization and work backwards and that, by doing
so, the “leading indicators” that bridge recommended solutions to the organization’s goals are
both easier to identify and more closely aligned with organizational goals. Further, this model
recommended that the four levels of training and evaluation be planned in “reverse order”
starting with Level 4 (Results), followed by Level 3 (Behavior), Level 2 (Learning), and Level 1 (Reaction).
This model suggested that Level 4 measures the results of the targeted outcomes by using
leading indicators to ensure that critical behaviors are on track to achieve the desired results.
Through Level 3 the organization can evaluate how much the individuals transfer what they have
learned in training once they are back in the department. Level 3 consists of critical behaviors,
required drivers, and on-the-job learning (Kirkpatrick & Kirkpatrick, 2016). The critical
behaviors are the key behaviors that the individuals must consistently be able to perform and
required drivers are ways to monitor, encourage, reinforce, and reward the continued use of the
critical behaviors. With Level 2, participants are evaluated on the degree of the knowledge and
skills, attitude, confidence, and commitment they learned from training. Finally, with Level 1
the organization can evaluate the participants’ reaction to training, such as satisfaction,
engagement, and relevance (Kirkpatrick & Kirkpatrick, 2016). Creating the implementation
and evaluation framework using this model requires the organizational goal to be integrated with
the recommendations for solutions and to increase the support needed to successfully implement
change (Kirkpatrick & Kirkpatrick, 2016).
Organizational Purpose, Need, and Expectations
The integration of the knowledge and motivation solutions was designed to improve
learning and develop interest in the implementation of a comprehensive evaluation system that
will monitor and ensure 100% compliance of PBIS interventions. On the other hand,
organizational solutions focus on the work environment that may hinder the performance and/or
outcome of PBIS implementations. When distractions and/or issues in the work environment are
addressed, it is more likely for employees to focus their attention and energy on the established
goal.
In order to fulfill SEESD’s mission to raise the academic achievement bar and close
learning gaps for all students, it is important for the district to continue to implement PBIS and
create an evaluation system that measures the effectiveness of the PBIS framework. As such,
SEESD’s established goal was to design and implement an evaluation system to monitor the
effectiveness of all PBIS interventions to ensure 100% compliance, addressing its mission to raise
the bar and close learning gaps for all students, and to ensure a positive school climate across the
district.
The goal of the SEESD was to design and implement a comprehensive evaluation system
that will monitor and ensure 100% compliance of PBIS interventions addressing its mission by
June 2018. The PBIS committee members’ focus was to evaluate the implementation of PBIS at
each school site and make intervention recommendations for each school by August 2018. PBIS
committee members play a significant role in the design and implementation of a comprehensive
evaluation system. This study examined the knowledge and skills, motivational, and
organizational barriers that affect PBIS committee members’ abilities to design and implement a
comprehensive evaluation system. The proposed solution was to provide training, job aids, and
one-on-one support. The expected internal outcomes of these solutions are to (a) decrease office
discipline referrals reported to the school; (b) decrease referrals for suspension; (c) decrease the
number of expulsions; and (d) increase attendance rates. The expected external outcomes are to
(a) decrease office discipline referrals reported to the district; (b) decrease referrals for
suspension reported to the district; (c) decrease expulsions reported to the district; (d) increase
attendance rates; (e) increase positive reactions from the board of education members; and (f)
improve community impressions. In turn, these outcomes should improve attitudes and values
and the overall school climate.
Level 4: Results and Leading Indicators
Table 43 shows the proposed Level 4: Results and Leading Indicators in the form of
outcomes, metrics, and methods for both external and internal outcomes for SEESD. If the
internal outcomes are met as expected as a result of the training and organizational support, the
external outcomes should also be realized.
Table 43
Outcomes, Metrics, and Methods for External and Internal Outcomes

External Outcomes
- Decrease office discipline referrals reported to the district. Metric: number of office discipline referrals submitted by staff members. Method: monthly district reports.
- Decrease referrals for suspension reported to the district. Metric: number of suspension referrals submitted by the principals. Method: monthly district reports and county-wide reports.
- Decrease expulsions reported to the district. Metric: number of expulsion referrals submitted by the principals. Method: monthly district reports and county-wide reports.
- Increase attendance rates district-wide. Metric: district-wide school attendance percentages. Method: monthly district reports and county-wide reports.
- Increase positive reactions from board of education members. Metric: number of positive comments/statements made at board meetings. Method: comments made at monthly board meetings.
- Increase community impressions. Metric: number of positive comments/statements made by parents and members of the community. Method: daily comments made on social media platforms and the school district application; monthly board meetings.

Internal Outcomes
- Decrease office discipline referrals reported to the school. Metric: number of office discipline referrals submitted by staff members. Method: monthly report from the principal’s office; feedback during staff meetings.
- Decrease referrals for suspension. Metric: number of suspension referrals submitted by the principal. Method: monthly report from the principal’s office.
- Decrease number of expulsions. Metric: number of expulsion referrals submitted by the principal. Method: monthly report from the principal’s office.
- Increase attendance rates. Metric: individual, classroom, grade-level, and school attendance rates. Method: monthly report from the principal’s office; feedback during staff meetings.
Level 3: Behavior
Critical behaviors. The stakeholders of focus are the PBIS committee members completing the
PBIS workshop. The first critical behavior is that committee members must create a comprehensive
evaluation system. The second critical behavior is that they must implement the evaluation
system. The third critical behavior is that they must analyze data produced by the evaluation
system and complete the review on or before the deadline. The specific metrics, methods, and
timing for each of these outcome behaviors appear in Table 44.
Table 44
Critical Behaviors, Metrics, Methods, and Timing for Evaluation

1. PBIS committee members will create a comprehensive evaluation system. Metric: number of completed sections of each listed item on the working document of the evaluation system. Method: working document submitted to the district office and PBIS coordinator. Timing: every week.
2. PBIS committee members will implement the evaluation system. Metric: amount of completed data collected for each section of the evaluation system. Method: data for each section submitted to the district office, PBIS coordinator, and principal’s office. Timing: every week.
3. PBIS committee members will analyze data produced by the evaluation system. Metric: amount of data results within each section of the evaluation system to verify data submission/completeness. Method: data for each section submitted to the district office, PBIS coordinator, and principal’s office; data monitored and discussed during PBIS committee meetings. Timing: every week.
Required drivers. New PBIS committee members require the support of the PBIS
coordinator and the organization to reinforce what they learn in the workshop and to encourage
them to apply what they have learned to review accurately and on time. Rewards should be
established for achievement of performance goals to enhance the organizational support of new
committee members. Table 45 shows the recommended drivers to support critical behaviors of new
committee members.
Table 45
Required Drivers to Support Critical Behaviors

Reinforcing
- PBIS coordinator will request that members of the PBIS committee attend refresher trainings based on individual meetings. Timing: monthly. Critical behaviors supported: 1, 2, 3.
- PBIS coordinator will supply job aids for how to complete sections of the evaluation system. Timing: ongoing. Critical behaviors supported: 1, 2, 3.
- PBIS coordinator will set reminders for PBIS committee members to submit weekly data and attend weekly meetings. Timing: ongoing. Critical behaviors supported: 1, 2, 3.

Encouraging
- PBIS coordinator will meet with PBIS committee members at each individual school to coach them and facilitate meetings/discussions. Timing: ongoing. Critical behaviors supported: 1, 2, 3.
- Superintendent will encourage PBIS committee members to participate in the creation, implementation, and analysis of a comprehensive evaluation system. Timing: ongoing. Critical behaviors supported: 1, 2, 3.

Rewarding
- PBIS coordinator will publicly acknowledge PBIS committee members’ progress and successes in creating a comprehensive evaluation system. Timing: monthly. Critical behaviors supported: 1, 2, 3.
- Superintendent will publicly acknowledge PBIS committee members via newsletters and board meetings for their progress and successes in creating a comprehensive evaluation system. Timing: quarterly. Critical behaviors supported: 1, 2, 3.

Monitoring
- PBIS coordinator will create a system consisting of processes for requesting, processing, and monitoring data for the PBIS evaluation system; the principals will track data submitted using a checklist of required data; PBIS committee members will account for their progress in analyzing data during meetings; and social workers will track PBIS committee members’ progress. Timing: monthly. Critical behaviors supported: 1, 2, 3.
Organizational support. Four strategies could be used to ensure that the required
drivers occur: (a) the PBIS coordinator can create opportunities to meet on a weekly and monthly
basis, which would allow staff and administrators the opportunity to communicate their plans,
progress, and support for one another; (b) the PBIS coordinator will review the current
demands, workload, and resources to collaboratively develop a plan to redesign the work
process; (c) the PBIS committee members will provide monthly updates on assessment outcomes
to track data and create a culture of accountability, and the outcome data will then help the PBIS
committee members identify areas of growth that connect to the vision and goals; and (d) the PBIS
committee members will provide updates during scheduled meetings at the district office to
allow staff and administration the opportunity to openly communicate their plans, progress, and
support for one another to create a culture of accountability.
Level 2: Learning
Learning goals. Following completion of the recommended solutions, most notably the
PBIS workshop, the stakeholders will be able to:
1. Recognize what comprises an effective evaluation system that can monitor and ensure
compliance of PBIS interventions.
2. Recognize the various categories of PBIS to measure improvements and/or areas of
growth.
3. Create a comprehensive evaluation system.
4. Implement a comprehensive evaluation system.
5. Analyze the results of the PBIS evaluation system.
6. Reflect and evaluate their own performance.
7. Value the design and implementation of their work.
8. Demonstrate confidence that they can create an evaluation system.
Program. The learning goals listed in the previous section will be achieved through
training and exercises that will increase the knowledge and motivation of the learners and PBIS
coordinator. PBIS committee members will receive a refresher course to re-learn and
re-emphasize the goal of PBIS. The refresher course will take place in person; however, there will
be an option to participate online via Adobe Connect. The refresher course will also be recorded
for those who are unable to attend in-person or online and/or for those who would like to review
or gain clarity on particular topics. Then, they will learn what comprises an effective evaluation
system that can monitor and ensure compliance of PBIS interventions. The group will watch
videos, answer questions regarding the videos, have group discussions, and create visuals to
express their understanding and application of prior knowledge. In addition, PBIS committee
members will learn a wide range of evaluation approaches to measure improvements and/or
areas of growth by browsing the databases presented to them. After exploring a variety of
website links, they will get in groups and create a list that will include brief descriptions of what
each tool measures, website links to access each assessment tool, and an area to include login
information.
Examples of effective evaluation systems will be demonstrated and discussed. Then,
PBIS committee members will be provided with case scenarios where they will collaboratively
determine the items to include in the evaluation system. PBIS committee members will be
guided through their thinking process by being asked open-ended questions. Together, the PBIS
committee members will have the opportunity to practice their skills as they develop an
evaluation system. They will be provided with demonstrations and feedback as they practice
correcting performance mistakes. Lastly, PBIS committee members will be provided with a
PBIS passion planner/journal where they will be able to include questions or ideas they may
have. In their PBIS passion planner/journal, committee members will have a space to reflect on
their approaches and evaluate their own performance. Planners/journals will be discussed in
one-on-one meetings with the PBIS coordinator.
Furthermore, PBIS committee members will listen to two guest speakers from different
school districts who have successfully implemented a comprehensive evaluation system. This
will help them see the value of the task. Lastly, committee members will schedule monthly
meetings with the PBIS coordinator and receive corrective feedback and positive encouragement
on current status.
Evaluation of the components of learning. Demonstrating declarative knowledge is
often necessary as a precursor to applying the knowledge to solve problems. Thus, it is
important to evaluate learning for both declarative and procedural knowledge being taught. It is
also important that learners value the training as a prerequisite to using their newly learned
knowledge and skills on the job. However, they must also be confident that they can succeed in
applying their knowledge and skills and be committed to using them on the job. As such, Table
46 lists the evaluation methods and timing for these components of learning.
Table 46
Evaluation of the Components of Learning for the Program

Declarative Knowledge: “I know it.”
- Complete pre- and post-tests to measure knowledge. Timing: before, during, and after the workshop.
- Knowledge checks using multiple-choice questionnaires. Timing: before and after the workshop.

Procedural Skills: “I can do it right now.”
- Feedback from peers during meetings. Timing: during the workshop.
- Use case scenarios and multiple-choice question items. Timing: during and after the workshop.
- Demonstration of the use of job aids to perform a task. Timing: before, during, and after the workshop.
- Complete pre- and post-tests to measure knowledge. Timing: before, during, and after the workshop.

Attitude: “I believe this is worthwhile.”
- PBIS coordinator’s observation of participants’ statements and actions demonstrating they see the benefit of what they are being asked to do on the job. Timing: during the workshop.
- Open discussion of the value of what they are being asked to do. Timing: during the workshop.
- Complete pre- and post-tests to measure attitude. Timing: before, during, and after the workshop.

Confidence: “I think I can do it on the job.”
- Discussions following practice and feedback. Timing: during and after the workshop.
- Complete pre- and post-tests to measure confidence. Timing: before, during, and after the workshop.

Commitment: “I will do it on the job.”
- One-on-one meetings with PBIS committee members. Timing: after the workshop.
- Ask open-ended questions and have participants write or verbally share in front of the group. Timing: during the workshop.
- Complete pre- and post-tests to measure commitment. Timing: before, during, and after the workshop.
Level 1: Reaction
It is important to determine how the participants react to the PBIS workshop. Thus, it is
essential to confirm that the quality of the learning event was acceptable to the participants. As
such, Table 47 lists the methods used to measure whether participants found the learning event
favorable, engaging, and relevant.
Table 47
Components to Measure Reactions to the Program

Engagement
- Attendance. Timing: during the workshop.
- Completion of lessons/units. Timing: during the workshop.
- Workshop evaluation. Timing: at the end of the workshop.

Relevance
- Open discussion of the relevance of what participants are being asked to do. Timing: during the workshop.
- Workshop evaluation. Timing: at the end of the workshop.

Customer Satisfaction
- Open discussion of satisfaction. Timing: during the workshop.
- Workshop evaluation. Timing: at the end of the workshop.
Evaluation Tools
Immediately following the program implementation. Following the PBIS workshop,
the PBIS committee members will complete a 12-item questionnaire using a Likert scale (see
Appendix A). Two questions on the evaluation will be open-ended. For Level 1, during the PBIS
workshop, the PBIS coordinator will ask about the relevance of the content to their work and the
organization, delivery, and learning environment. Level 2 will include checks for understanding
utilizing case scenarios and creating friendly competitions among groups when responding to
questions and scenarios drawn from the content.
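As a sketch of how the blended questionnaire results might be tallied, the snippet below averages Likert ratings separately by Kirkpatrick level. The item names, their level assignments, and the 1–5 scale are hypothetical placeholders, not the actual Appendix A instrument.

```python
# Hypothetical scoring of a blended Level 1 / Level 2 questionnaire.
# Items, level assignments, and the 1-5 Likert scale are illustrative
# placeholders, not the actual Appendix A instrument.
from statistics import mean

responses = {
    "relevance_to_work": 4,    # Level 1 item
    "delivery_quality": 5,     # Level 1 item
    "confidence_to_apply": 3,  # Level 2 item
    "value_of_content": 4,     # Level 2 item
}
levels = {
    "relevance_to_work": 1, "delivery_quality": 1,
    "confidence_to_apply": 2, "value_of_content": 2,
}

def level_averages(responses, levels):
    """Average the Likert ratings separately for each Kirkpatrick level."""
    by_level = {}
    for item, rating in responses.items():
        by_level.setdefault(levels[item], []).append(rating)
    return {lvl: mean(vals) for lvl, vals in by_level.items()}

print(level_averages(responses, levels))  # {1: 4.5, 2: 3.5}
```

Separating the averages by level keeps reaction (Level 1) results from masking weaker learning (Level 2) results, which matters when deciding whether the workshop itself needs revision.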
Delayed for a period after the program implementation. Approximately six weeks
after the PBIS workshop, the organization will administer a questionnaire containing scaled and
open-ended items to measure satisfaction and relevance of the training (Level 1), confidence and
value of applying their training (Level 2), application of the training to creating a comprehensive
evaluation plan (Level 3), and the extent to which their performance has become more efficient
both internally and externally (Level 4).
Data Analysis and Reporting
The Level 4 goals of decreasing office discipline referrals, suspensions, and expulsions, and of
increasing attendance, will be analyzed to determine if progress has been made towards achieving the
organizational goal. This is when the organization will determine if the solutions offered helped
to close the gap. Every month, the PBIS coordinator will track the number of reports processed
and will formulate responses to provide to everyone (as a school and as a district) via email.
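The monthly tracking described above could be summarized as month-over-month changes in each Level 4 metric, for example as in the sketch below; the metric names and counts are hypothetical illustrations, not actual SEESD data.

```python
# Hypothetical month-over-month summary of Level 4 leading indicators.
# Metric names and counts are illustrative, not actual SEESD data.
prior = {"office_referrals": 120, "suspensions": 15, "expulsions": 2, "attendance_pct": 94.0}
current = {"office_referrals": 104, "suspensions": 12, "expulsions": 2, "attendance_pct": 95.2}

def monthly_changes(prior, current):
    """Return the signed month-over-month change for each tracked metric."""
    return {metric: round(current[metric] - prior[metric], 2) for metric in prior}

changes = monthly_changes(prior, current)
# Negative change is progress for discipline metrics; positive is progress for attendance.
on_track = (changes["office_referrals"] <= 0 and changes["suspensions"] <= 0
            and changes["expulsions"] <= 0 and changes["attendance_pct"] >= 0)
print(changes, on_track)
```

A summary of this form could be the basis of the monthly email the PBIS coordinator sends to each school and to the district.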
Limitations
It is important to note potential limitations of the research study, including possible
threats to internal and external validity. One limitation of the study was its reliance on the PBIS
committee members’ self-report measures, especially if they were knowledgeable about PBIS
interventions and evaluation systems. Although the surveys were distributed via email, it is
possible that PBIS committee members or school administrators could have influenced
participants’ responses on the survey through informal conversations. Another limitation of this
study was the instrumentation used and sample sizes. The survey items were generated by the
researcher and were untested instruments. The results were used to make associations between
organizational features using a small sample size relative to other research studies mentioned in
the literature. Lastly, the design was intended to yield rich data and recommendations for the
SEESD; however, more time would have been needed to interview and observe more individuals and
schools. This would have allowed further analysis, comparisons, and interpretations to be made
in order to make strong assertions.
Future Research
The present study is an initial effort to understand district-wide knowledge, motivation,
and organizational needs in order to design and implement a comprehensive evaluation system
that will monitor and ensure 100% compliance of PBIS interventions. The findings of the
current study have important implications that can influence or enhance the creation of a
comprehensive evaluation system. Future research could focus on measuring effectiveness on a
variety of domains, such as academic, behavioral, social emotional learning, and PBIS
interventions. The newly created evaluation system could examine systemic issues that may
confirm lack of readiness and commitment from participating schools or identify areas for
increased assistance. Future research could also assess perceptions of levels of implementation
and professional development needs related to PBIS implementation. In addition, future research
could focus on accountability for student success by developing high quality PBIS plans and
integrating innovative approaches, such as a Multi-Tiered System of Supports (MTSS). This
would establish data-driven decision making processes and evaluation at SEESD to determine
100% compliance of PBIS interventions.
Conclusion
Adapting Clark and Estes’ (2008) Gap Analysis framework as an evaluation tool, this
study sought to assess the areas of knowledge and skill, motivation, and organizational resources
necessary to reach the organizational performance goal of developing a comprehensive
evaluation system. To identify the challenges that contribute to the gap in accomplishing the
organizational goal, a mixed-methods approach was used to collect data to validate and prioritize
assumed causes in knowledge and skills, motivation, and organizational factors. The findings
revealed root causes related to creating a comprehensive evaluation system, implementing an
evaluation system, and analyzing the data produced by the evaluation system. Proposed
solutions were aimed at strengthening organizational supports, fostering commitment and value,
and creating a strong knowledge base for capacity building. Essentially, this study demonstrated
how stakeholders could systematically apply the Gap Analysis framework to address
performance issues when implementing initiatives that will monitor and ensure compliance of
interventions.
The New World Kirkpatrick Model informed the implementation and evaluation plan of
this study (Kirkpatrick & Kirkpatrick, 2016). The four levels of training and evaluation were
used to ensure that PBIS committee members have the knowledge, motivation, and
organizational support to provide data on perceived barriers to developing a comprehensive
evaluation system that will monitor and ensure 100% compliance of PBIS interventions. With
this model, the training begins with the identification of outcomes, metrics, and methods to
measure the results of the targeted outcomes that are integrated with the organization’s goals.
Next, the training establishes the critical behaviors to assess if the PBIS committee members are
utilizing what they have learned once they are back at their school sites. Furthermore, learning
outcomes are identified and the PBIS committee members are evaluated on their learning and
knowledge, attitude, commitment, and confidence during the PBIS workshop. Finally, methods
to assess how the PBIS committee members are reacting to training were developed to determine
satisfaction, engagement, and the relevance of the training. To implement change and maximize
the program results, it is important to evaluate and analyze the data collected during program
implementation.
During training when the level of reaction and learning does not meet expectations, the
trainer needs to identify the issue and changes need to be made to the program. If the PBIS
committee members are not learning or reacting as expected, it is recommended that the trainer
ask them for their thoughts and address the issues that are raised (Kirkpatrick & Kirkpatrick, 2016).
When the level of reaction and learning meets expectations, the trainer may want to stop and
discuss what increased engagement (Kirkpatrick & Kirkpatrick, 2016).
When the level of behavior and results does not meet expectations, it is important to
communicate with the PBIS committee members to identify the issues with the required drivers
and critical behaviors (Level 3) that are not being applied, and to ask why the leading
indicators and desired results (Level 4) are not moving forward (Kirkpatrick & Kirkpatrick,
2016). The trainer can solicit feedback through surveys or interviews and ask the PBIS
committee members what behaviors would allow them to move forward to achieve their
performance goals. When the levels of behavior and results meet expectations, it is
recommended that high-achieving participants be recruited to identify what they are doing to
increase their performance and to share it with the organization (Kirkpatrick &
Kirkpatrick, 2016).
Finally, it is important to provide a final report on the training outcomes to the PBIS
committee members and the leadership team. Organizational support is a component that also
determines the success of any training program (Kirkpatrick & Kirkpatrick, 2016). To drive
performance and results, Kirkpatrick and Kirkpatrick (2016) recommend providing reports and
creating touch points throughout the implementation process. To engage the leadership team in
topics that are important to them for evaluation, the reports should address the relevance,
credibility, compelling nature, and efficiency of the program (Kirkpatrick & Kirkpatrick, 2016).
References
Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How
learning works. San Francisco, CA: Jossey-Bass.
Anderman, E. M., & Anderman, L. H. (2010). Classroom motivation. New York, NY: Pearson.
Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and
assessing: A revision of Bloom's Taxonomy of educational objectives. New York, NY:
Longman.
Baker, L. (2006). Metacognition. Retrieved from http://www.education.com/reference/article/
metacognition/
Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: Freeman.
Bandura, A. (2000). Exercise of human agency through collective efficacy. Current Directions in
Psychological Science, 9(3), 75-78.
Barrett, S. B., Bradshaw, C. P., & Lewis-Palmer, T. (2008). Maryland statewide PBIS initiative:
Systems, evaluation, and next steps. Journal of Positive Behavior Interventions, 10(2),
105-114. doi:10.1177/1098300707312541
Brinamen, C., & Page, F. (2012). Using relationships to heal trauma: Reflective practice
creates a therapeutic preschool. Young Children, 67(5), 40.
California Department of Education (CDE). (2017, February 24). Educational demographics unit:
District enrollment report. Retrieved from http://dq.cde.ca.gov/dataquest.asp
Christofferson, R. D., & Callahan, K. (2015). Positive behavior support in schools (PBSIS): An
administrative perspective on the implementation of a comprehensive school-wide
intervention in an urban charter school. Education Leadership Review of Doctoral
Research, 2(2), 35-49.
Clark, R. E., & Estes, F. (2008). Turning research into results: A guide to selecting the right
performance solutions. Charlotte, NC: Information Age Publishing.
Coffey, J. H., & Horner, R. H. (2012). The sustainability of schoolwide positive behavior
interventions and supports. Exceptional Children, 78(4), 407-422. doi:10.1177/00
1440291207800402
Cohen, R., Kincaid, D., & Childs, K. E. (2007). Measuring school-wide positive behavior
support implementation: Development and validation of the benchmarks of quality.
Journal of Positive Behavior Interventions, 9(4), 203-213.
Denler, H., Wolters, C., & Benzon, M. (2009). Social cognitive theory. Retrieved from
https://project542.weebly.com/uploads/1/7/1/0/17108470/social_cognitive_theory__educ
ation.com.pdf
Dods, J. (2013). Enhancing understanding of the nature of supportive school-based relationships
for youth who have experienced trauma. Canadian Journal of Education, 36(1), 71-95.
Eccles, J. (2006). Expectancy value motivational theory. Retrieved from http://www.education
.com/reference/article/expectancy-value-motivationaltheory/
Fink, A. (2017). How to conduct surveys: A step-by-step guide. Los Angeles, CA: Sage
Publications.
Fitzgerald, M. M., & Cohen, J. A. (2012). Trauma-focused cognitive behavior therapy for school
psychologists. Journal of Applied School Psychology, 28(3), 294-315.
Freeman, R., Miller, D., & Newcomer, L. (2015). Integration of academic and behavioral MTSS
at the district level using implementation science. Learning Disabilities: A Contemporary
Journal, 13(1), 59-72.
ESTABLISHING A SYSTEMATIC EVALUATION OF PBIS 159
Gallimore, R., & Goldenberg, C. (2001). Analyzing cultural models and settings to connect
minority achievement and school improvement research. Educational Psychologist,
36(1), 45-56.
George, H. P., & Childs, K. E. (2012). Evaluating implementation of schoolwide behavior
support: Are we doing it well? Preventing School Failure, 56(4), 197-206.
Grossman, R., & Salas, E. (2011). The transfer of training: What really matters. International
Journal of Training and Development, 15(2), 103-120.
Horner, R. H., Kincaid, D., Sugai, G., Lewis, T., Eber, L., Barrett, S., & Johnson, N. (2013).
Scaling up school-wide positive behavioral interventions and supports. Journal of
Positive Behavior Interventions, 16(4), 197-208. doi: 10.1177/1098300713503685
Hoyle, C. G., Marshall, K. J., & Yell, M. L. (2011). Positive behavior supports: Tier 2
interventions in middle schools. Preventing School Failure, 55(3), 164-170.
Kirkpatrick, J., & Kirkpatrick, W. (2016). Kirkpatrick's four levels of training evaluation.
Alexandria, VA: ADT Press.
Krathwohl, D. R. (2002). A revision of Bloom’s Taxonomy: An overview. Theory Into Practice,
41(4), 212-218.
Lewis, T. J., & Sugai, G. (1999). Effective behavior support: A systems approach to proactive
school-wide management. Focus on Exceptional Children, 31, 1-17.
Los Angeles County Office of Education. (2017a, February 26). Student Services: Positive
behavior interventions and supports. Retrieved from http://www.lacoe.edu/Student
Services/PositiveBehaviorInterventionsandSupportPBIS.aspx
ESTABLISHING A SYSTEMATIC EVALUATION OF PBIS 160
Los Angeles County Office of Education. (2017b, March 13). School improvement: Local
control and accountability. Retrieved from http://www.lacoe.edu/SchoolImprovement
/LCA P.aspx
Mayer, R. E. (2011). Applying the science of learning. Boston, MA: Pearson Education.
Maxwell, J. A., (2013). Qualitative research design: An interactive approach. Thousand Oaks,
CA: Sage Publications
McIntosh, K., Mercer, S. H., Nese, R. T., Strickland-Cohen, M. K., & Hoselton, R. (2016).
Predictors of sustained implementation of school-wide positive behavioral interventions
and supports. Journal of Positive Behavior Interventions, 18(4), 209-218.
Merriam, S. B., & Tisdell, E. J. (2016). Qualitative research: A guide to design and
implementation. San Francisco, CA: Jossey-Bass.
Newcomer, L. L., Freeman, R., & Barrett, S. (2013). Essential systems for sustainable
implementation of Tier 2 supports. Journal of Applied School Psychology, 29(2), 126-
147.
O'Donnell, C. L. (2008). Defining, conceptualizing, and measuring fidelity of implementation
and its relationship to outcomes in K-12 curriculum intervention research. Review of
Educational Research, 78(1), 33-84. doi:10.3102/0034654307313793
Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational Research,
66(4), 543-578.
Pajares, F. (2006). Self-efficacy theory. Retrieved from http://www.education.com/reference/
article/self-efficacy-theory/
Patton, M. Q. (2002). Qualitative research & evaluation methods (3rd ed.). Thousand Oaks, CA:
Sage Publications.
ESTABLISHING A SYSTEMATIC EVALUATION OF PBIS 161
Pintrich, P. R. (2003). A motivational science perspective on the role of student motivation in
learning and teaching contexts. Journal of Educational Psychology, 95(4), 667-686.
Positive Behavioral Interventions and Supports. (2017a, February 24). PBIS and the law.
Retrieved from https://www.pbis.org/school/pbis-and-the-law
Positive Behavioral Interventions and Supports. (2017b, March 25). District level PBIS.
Retrieved from https://www.pbis.org/school/district-level
Rueda, R. (2011). The 3 dimensions of improving student performance: Finding the right
solutions to the right problems. New York, NY: Teachers College Press.
Schraw, G., & McCrudden, M. (2006). Information processing theory. Retrieved from
http:// www.education.com/reference/article/information-processing-theory/
Sirkin, H. L., Keenan, P., & Jackson, A. (2005). Harvard Big Review, 83(10), 108-118.
Social Emotional Elementary School District. (2016). Social Emotional Elementary School
District parent handbook. (further information withheld for confidentiality).
Sprague, J., Nishioka, V., & Smith, S. G. (2007). Safe schools, positive behavior supports, and
mental health supports: Lessons learned from three safe schools/healthy students
communities. Journal of School Violence, 6(2), 93-115.
Ziomek-Daigle, J., Goodman-Scott, E., Cavin, J., & Donohue, P. (2016). Integrating a
multi-tiered system of supports with comprehensive school counseling programs.
Professional Counselor, 6(3), 220-232.
ESTABLISHING A SYSTEMATIC EVALUATION OF PBIS 162
Appendix A
Training Evaluation Form
Date(s) of Training: ________________  Title of Training: ________________
Trainer Name(s): ________________  Location(s) of Training: ________________
Instructions: Please indicate your level of agreement with statements 1-10 below.
Strongly Agree | Agree | Neutral | Disagree | Strongly Disagree
1. The objectives of the training were clearly defined.
2. Participation and interaction were encouraged.
3. The topics covered were relevant to me.
4. The content was organized and easy to follow.
5. The materials distributed were helpful.
6. This training experience will be useful in my work.
7. The trainer(s) was/were knowledgeable about the training topics.
8. The training was well prepared.
9. The time allotted for the training was sufficient.
10. The meeting room and facilities were adequate and comfortable.
11. What did you like most about this training?
12. What aspects of the training could be improved?
We appreciate your feedback!
Appendix B
Survey Items
Knowledge
1. Multiple choice. Complete the sentence.
OR
2. Which statement below is true?
SEESD’s goal for the evaluation of PBIS is to…
a) Create an evaluation system that measures compliance
and effectiveness of PBIS interventions.
b) Build on the evaluation system that is already in place.
c) Recreate an evaluation system that measures compliance
and effectiveness of PBIS interventions.
d) Merge evaluation systems into one system that measures
compliance and effectiveness of PBIS interventions.
3. Multiple choice.
Identify the components of PBIS…
a) Establish expectations, teach expectations, reinforce
expectations, correct behavior
b) Intensive, Individual Interventions, Targeted Group
Interventions, Universal Interventions
c) Training, coaching, evaluating
d) Individual student systems, Classroom setting systems, School-
wide systems, District-level systems
4. Multiple choice.
An effective evaluation system is comprised of...
a) Fidelity of implementation practices, Team self-assessments,
Walk-through observation tools, Performance evaluations
b) PowerSchool student information system, Team
self-assessments, Walk-through observation tools,
Performance evaluations
c) PowerSchool student information system, School-wide
Information System (SWIS), Walk-through observation
tools, Performance evaluations
d) PowerSchool student information system, Attendance
data, Suspension data, Expulsion data, Mental Health
service data, School-wide Information System (SWIS), Self
Assessment Survey (SAS), Tiered Fidelity Inventory (TFI)
5. Multiple choice. Complete the sentence.
____________ are the four categories to consider when measuring areas
of growth.
a) Fidelity of implementation practices, Team self-assessments,
Walk-through observation tools, and Performance evaluations
b) PowerSchool student information system, Team
self-assessments, Walk-through observation tools,
And Performance evaluations
c) PowerSchool student information system, School-wide
Information System (SWIS), Walk-through observation
tools, Performance evaluations
d) PowerSchool student information system, School-wide
Information System (SWIS), Self Assessment Survey
(SAS), and Tiered Fidelity Inventory (TFI)
6. Multiple choice. Choose the best option to complete the sentence.
PBIS focuses on creating and sustaining…
a) Tier 1 (universal interventions for all students); Tier 2
(targeted group support for some students); and Tier 3
(individual support for a few students)
b) Individual students and targeted groups
c) Academic systems and behavioral systems
d) Intervention strategies
7. Multiple choice. Choose the best option to complete the sentence.
Foundational school-wide systems feature…
a) PBIS implementation that is clear and a priority to the
district
b) Resources that support implementation
c) Alignment and integration of PBIS with other district
priorities, needs, and initiatives
d) All of the above
8. Multiple choice. Choose the best answer to complete the sentence.
You are the Director of Student Support Services, and suspension and expulsion rates are
increasing district-wide despite PBIS implementation.
You should:
a) formulate formative evaluation questions that measure
various elements of PBIS intervention practices
b) formulate summative evaluation questions that measure
various elements of PBIS intervention practices
c) formulate formative and summative evaluation questions
that measure elements of PBIS intervention practices
d) ensure there is uniformity across the district and
formulate formative and summative evaluation questions
that measure various elements of PBIS intervention practices
9. Multiple choice. Complete the sentence and choose all that apply to your
evaluation approach.
I evaluate when...
a) I need to ensure there is a strong foundation before I
expand or develop
b) I need to assess for readiness and commitment from
those participating
c) I need to determine if ongoing training is needed to improve
efficiency and effectiveness.
d) All of the above
10. Multiple choice. Complete the sentence and choose all that apply when analyzing
evaluation results.
When I analyze results I…
a) organize and store the data
b) organize, review, share, and monitor the data
c) review, share, and monitor the data
d) share and monitor the data
11. Multiple choice. Choose the best answer to complete the sentence.
Intervention effectiveness could result in…
a) revisions to intervention practices, such as procedures,
intensity, and/or monitoring implementation integrity.
b) designing a new intervention.
c) implementing a new program.
d) changing the structure of the program.
12. Multiple choice. Complete the sentence.
I self-evaluate by…
a) regularly monitoring my progress towards my goals.
b) thinking through alternatives before determining my
answer.
c) motivating myself.
d) All of the above.
13. Multiple choice. Complete the sentence.
We (as a group) evaluate our own performance by…
a) providing feedback.
b) discussing what is working and what is not working.
c) making adjustments.
d) All of the above.
Motivation
14. Put these sentences in order of your value.
___ designing an evaluation plan for PBIS interventions.
___ implementing an evaluation plan for PBIS interventions.
___ monitoring progress of PBIS interventions.
___ determining effectiveness of PBIS interventions.
OR
Put these sentences in order of your value.
___ I enjoy PBIS and its interventions to improve student
behavior.
___ Learning how to create an evaluation plan for PBIS
interventions is valuable/useful for me in terms of my future goals.
___ As a PBIS committee member, it is important for me to learn
how to design, implement, monitor, and determine effectiveness of
PBIS interventions.
___ Being involved in the PBIS committee is worth it to me even
if it takes more time than expected.
15. Rate your degree of confidence in doing the following by indicating strongly
agree to strongly disagree using the scale below:
I can influence students, parents, and administrators to comply with PBIS
interventions.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
If I plan accordingly, I can meet deadlines.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
I am confident in my ability to design an evaluation plan for PBIS.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
I am confident in my ability to monitor PBIS interventions once they have
been implemented.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
I am confident in my ability to assess effectiveness of PBIS interventions.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
16. Rate your degree of positivity in doing the following by indicating strongly agree
to strongly disagree using the scale below:
I feel positive about designing a comprehensive evaluation system.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
I feel positive when implementing a comprehensive evaluation
system.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
I feel positive when monitoring a comprehensive evaluation
system.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
I feel positive when determining effectiveness of PBIS
interventions.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
Organization
17. Please indicate the extent to which the item is present at your school site
using the scale below:
My school site provides planning time to design, implement, and
monitor compliance of PBIS interventions.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
18. Please indicate the extent to which the item is present at your school site using the
scale below:
I receive support from the administrative staff at my school site.
OR
The administrative team at my school site supports my efforts.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
19. Please indicate the extent to which the item is present at your school site.
The district's policies align with SEESD’s implemented PBIS
interventions.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
20. Please indicate the extent to which the item is present at your school site using the
scale below:
The district's mission of investing in optimal learning environments that
enhance safety and create positive school climates aligns with PBIS
interventions at my school site.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
The district values and recognizes schools that apply PBIS interventions.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
My school site values and recognizes those who apply PBIS interventions.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
The district monitors and holds schools accountable for the
implementation of PBIS interventions.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
My school site monitors and holds the entire school accountable for the
implementation of PBIS interventions.
___ Strongly agree
___ Agree
___ Neutral
___ Disagree
___ Strongly disagree
Appendix C
Interview Items
Knowledge
1. Tell me what SEESD’s goal is for the evaluation of PBIS.
2. In your own words, tell me the components of PBIS.
3. What are some measurement tools that evaluate PBIS interventions?
4. There are four categories to consider when measuring improvement of PBIS
interventions and/or areas of growth…
1.
2.
3.
4.
Please provide an example of these categories and how they each affect
the outcome.
5. Please explain the relationship between the three tiers of support.
6. Please provide examples of essential features that are needed when implementing
a school-wide PBIS system. How do these features influence the implementation
of PBIS?
7. Walk me through the steps you use to ensure PBIS interventions are being
implemented.
8. When and how do you determine an evaluation system is needed?
9. Walk me through the process of obtaining PBIS evaluation results and
determining if progress was made.
10. Please explain how data results influence the way PBIS interventions are
incorporated into the schools.
OR
Please give an example of how you use data results to influence the way PBIS
interventions are incorporated into the schools.
11. How do you evaluate the effectiveness of your own abilities to create a
comprehensive evaluation system?
12. How do you evaluate the effectiveness of the committee’s evaluation system?
Motivation
13. How valuable is the design and implementation of PBIS interventions?
14. Could you discuss some of your reasons why we should design or implement an
evaluation system?
15. To what degree do you feel confident about your ability to create an evaluation
system that will monitor and ensure compliance of PBIS interventions?
16. Tell me about a time when you felt confident creating an evaluation system.
17. What impacts your confidence? How consistent is it?
18. Describe how you feel about designing, implementing, monitoring, and
determining effectiveness of PBIS interventions.
19. In the face of challenges and setbacks, do you feel positive about engaging in the
creation of a comprehensive evaluation system?
20. Tell me how you feel about designing a comprehensive evaluation system for
PBIS.
21. Tell me how you feel about implementing a comprehensive evaluation system
for PBIS.
22. Tell me how you feel about monitoring a comprehensive evaluation system for
PBIS.
23. Tell me how you feel about determining the effectiveness of a program.
Organization
24. What kind of planning time does your school allocate for PBIS? How much
time? How do you utilize this time?
25. What type of support has your school provided to design, implement, and monitor
PBIS?
26. If things are not going well with the design, implementation, and monitoring of
PBIS interventions, how would the school administrators react?
27. To what extent do your district’s policies align with SEESD’s goal?
28. What extent do the district’s policies align with SEESD’s expected level of
implementation?
Appendix D
Informed Consent/Information Sheet
University of Southern California
Rossier School of Education
3470 Trousdale Pkwy, Los Angeles CA, 90089
An Innovative and Integrative Evaluation System for Positive Behavioral Interventions
and Supports to Improve Future Implementation and Accountability Approaches in Urban
School Districts
You are invited to participate in a research study. Research studies include only people who
voluntarily choose to take part. This document explains information about this study. You should
ask questions about anything that is unclear to you.
PURPOSE OF THE STUDY
This study aims to design and implement a comprehensive evaluation system that will
monitor and ensure compliance with PBIS interventions in line with the district's mission.
Maria Ruelas has been asked to conduct both interviews and observations and has
structured questions revolving around the topic of PBIS. Ms. Ruelas is particularly
interested in understanding PBIS evaluation systems implemented in elementary schools.
Ms. Ruelas wants to gain an understanding of the knowledge and skills, motivation, and
organizational approaches that influence the creation of a comprehensive evaluation
system. Ms. Ruelas plans to interview a school administrator, a school site social worker,
and a teacher, and to conduct three observations (one in the classroom, one of a staff
meeting or training, and one of child and adult interactions in the school) to gain further
information.
PARTICIPANT INVOLVEMENT
If you agree to take part in this study, you will be asked to perform the following critical
behaviors to achieve the study's performance goal:
a. create a comprehensive evaluation system;
b. implement the evaluation system; and
c. analyze the data produced by the evaluation system.
In addition, you will be asked a series of multiple-choice and open-ended questions to
assess certain components of PBIS.
CONFIDENTIALITY
This study is anonymous. We will not be collecting or retaining any information about
your identity. The records of this study will be kept strictly confidential. Research
records will be kept in a locked file, and all electronic information will be coded and
secured using a password protected file. We will not include any information in any
report we may publish that would make it possible to identify you. Your identity will not
be disclosed in the material that is published. However, you will be given the
opportunity to review and approve any material that is published about you.
Required language:
The members of the research team and the University of Southern California’s Human
Subjects Protection Program (HSPP) may access the data. The HSPP reviews and
monitors research studies to protect the rights and welfare of research subjects.
INVESTIGATOR CONTACT INFORMATION
The Principal Investigator is:
Maria Ruelas, MSW, PPSC
mruelas@usc.edu
The Faculty Advisors are:
Dr. Kenneth Yates
kennetay@usc.edu
323-591-4688
Dr. Melora Sundt
Sundt@rossier.usc.edu
310-403-6671
IRB CONTACT INFORMATION
University Park Institutional Review Board (UPIRB), 3720 South Flower Street #301, Los
Angeles, CA 90089-0702, (213) 821-5272 or upirb@usc.edu
Appendix E
Recruitment Letter
Hello,
I am seeking your input through surveys and interviews for my dissertation study - An
Innovative and Integrative Evaluation System for Positive Behavioral Interventions and Supports
to Improve Future Implementation and Accountability Approaches in Urban School Districts.
The data collected through surveys and interviews will be used to help better understand the
knowledge, motivation, and organizational indices that impact the creation of a comprehensive
evaluation system.
Please complete the electronic survey by following the link below:
Qualtrics Link
The survey contains 20 questions and should take about 40 minutes to complete. The survey
will remain open until xx/xx/xxxx.
Additionally, I would appreciate it if I could schedule a subsequent interview between the dates
of xx/xx/xxxx and xx/xx/xxxx. Interviews will be scheduled for one hour. Please reply to this
e-mail with a date and time that will work for you within that time frame.
Results of surveys and interviews will be used for the purposes of the dissertation study - An
Innovative and Integrative Evaluation System for Positive Behavioral Interventions and Supports
to Improve Future Implementation and Accountability Approaches in Urban School Districts.
No identifiable information will be included in any materials presented throughout this project.
If you have any questions or would like to be removed from future communications regarding
this study, please reply to this e-mail accordingly.
Maria Ruelas, MSW, PPSC
mruelas@usc.edu