A PROFESSIONAL DEVELOPMENT PROGRAM EVALUATION:
TEACHER EFFICACY, LEARNING, AND TRANSFER
by
Dana Anne Miyuki Tomonari
__________________________________________________________
A Dissertation Presented to the
FACULTY OF THE USC ROSSIER SCHOOL OF EDUCATION
UNIVERSITY OF SOUTHERN CALIFORNIA
In Partial Fulfillment of the
Requirements for the Degree
DOCTOR OF EDUCATION
May 2012
Copyright 2012 Dana Anne Miyuki Tomonari
DEDICATION
This dissertation is dedicated to my grandfather, Mitsuo “Mike” Tsugawa,
who is my hero. Your service and dedication, both during and after World War II, to
friends, family, and country have set a high standard that I use to measure my own
actions and choices. You have taught me to stand up for what I believe in and to
work hard to achieve my dreams. I love you gramps!
ACKNOWLEDGEMENTS
I have been truly blessed to have so many people who have supported me
throughout my USC journey. First and foremost, I would like to thank my family,
whose love and support have been my safe harbor. Although we are separated by
distance, we are always together in our hearts.
Mahalo nui to my committee chairperson, Dr. Melora Sundt. Your support
through this process has been invaluable and your attention to detail in the writing
process has challenged me to become a better writer.
I would also like to thank my committee members, Dr. Dominic Brewer and
Dr. Darnell Cole. It has been my privilege to learn from both of you. Thank you for
pushing me to ask thoughtful questions and to think critically.
A huge mahalo to Dr. Dennis Hocevar, for his patience and guidance through
the data analysis process, and Kevin Collins, for his assistance with the IRB process.
Team Melora…we did it! I am blessed to count you as friends and know that
the bonds we have formed will last a lifetime! A very special thank you to Drs. Kari
Luna-Nunokawa, Laure Burke, and Babette Moreno. Thanks for talking me off the
ledge more than once and for giving me the inspiration I needed when times were
tough.
Cil, Melissa, Stacie, Jo, and Maddy - you have been there for me through
thick and thin. I can’t thank you enough for always having my back and
encouraging me to keep pushing forward. You girls are my sisters at heart and I love
you.
Mahalo to my colleagues at Kamehameha Schools for participating in and
supporting my work. I am privileged to work with such amazing educators! Imua!
TABLE OF CONTENTS
Dedication ii
Acknowledgements iii
List of Tables vi
List of Figures vii
Abstract viii
Chapter One: Overview of the Study 1
Chapter Two: Literature Review 22
Chapter Three: Methodology 64
Chapter Four: Results 92
Chapter Five: Discussion 120
References 138
Appendices 147
Appendix A: Andragogical Learner Analysis 147
Appendix B: Interview Protocol 148
Appendix C: Challenging Behavior Process Training Power Point 149
Appendix D: Pre-Test Survey 175
Appendix E: Post-Test Survey 182
Appendix F: Follow-Up Survey 190
LIST OF TABLES
Table 1: Comparison of Pedagogical and Andragogical Assumptions 30
Table 2: TSES Short Form Reliability 81
Table 3: Research Questions, Survey Items, and Statistical Analyses 89
Table 4: Response Rates 93
Table 5: Sample Demographics 94
Table 6: Participant Evaluation of Coherence 99
Table 7: Type and Frequency of Needed Support 100
Table 8: Type of Support Expected 101
Table 9: Open Ended Responses to Follow-up Surveys 102
Table 10: Andragogical Learner Analysis 104
Table 11: Features of Effective Professional Development 107
Table 12: Participant Evaluation of Professional Development Experience 108
Table 13: Mean Pre- and Post-test Scores for TSES 110
Table 14: Number of Participants with Correct Answer 112
LIST OF FIGURES
Figure 1. Andragogy in Practice model 34
Figure 2. Pre- and Post-test answers to Knowledge Item 2: “Who may be on the Regional Implementation Team (RIT)? (check all that apply)” 114
Figure 3. Pre- and Post-test answers to Knowledge Item 4: “Which is NOT one of the purposes of the Intervention Plan Meeting? (check all that apply)” 115
Figure 4. Pre- and Post-test answers to Knowledge Item 5: “Which of the following is used to address challenging behaviors that are serious safety concerns and extreme? (check all that apply)” 116
ABSTRACT
Although professional development is an important means of improving both
teachers’ skills and student outcomes, there is a dearth of high quality empirical
research on the efficacy of such efforts. The efficacy of the Challenging Behavior (CB)
Process was assessed using a mixed-methods approach that included pre-,
post-, and follow-up surveys. The participants were preschool teachers who worked
for Kamehameha Schools on the island of Oahu. The relationship of the training to
teacher efficacy, learning, and transfer was assessed. The analyses determined that
the training was well-designed to meet the needs of the teaching staff, participants
reacted positively to the training, and participation in the training was related to
positive changes in knowledge about the CB Process. However, participation in the
training was also related to a statistically significant decrease in teacher efficacy for
teaching strategies at the time of the follow-up, and a number of knowledge gaps
were uncovered. Finally, the study revealed a reporting rate for challenging
behaviors that was much lower than expected. The overarching implication drawn
from this study is that comprehensive evaluation of professional development in
education is both necessary and valuable. It is not enough to collect data about how
an initiative is working. It is important to take time to put the pieces of the
professional development puzzle together to determine how and if professional
development efforts create or support change in the classroom.
CHAPTER ONE
OVERVIEW OF THE STUDY
Education reforms and initiatives have set ambitious goals for improving
student outcomes (No Child Left Behind Act, 2001). For example, the No Child Left
Behind Act of 2001 (NCLB) specifies that all students in the United States will be
proficient in reading and math by 2014. The classroom level changes that are
necessary to meet the ambitious goals of many school reforms are highly dependent
on having effective and highly trained teachers (Spillane, 1999). The importance of
effective and highly trained teachers is supported by an analysis of national survey
and case study data that indicate teacher quality is more strongly related to improved
student outcomes than are student demographic characteristics and school level
variables such as class size and overall spending levels (Darling-Hammond, 1999).
The importance of teacher quality is further supported by value-added studies that
demonstrate the positive effects of effective teachers on student outcomes (Jordan,
Mendro, & Weerasinghe, 1997; Kane, Rockoff, & Staiger, 2006; Sanders & Horn,
1998).
In addition to setting high goals for student achievement, many educational
reforms also call for the transformation of the traditional roles and responsibilities of
classroom teachers (Guskey, 2000). These responsibilities extend beyond the
immediate teacher-student relationship and may include participation in school
governance through shared-decision making, increasing use of data-driven decision
making paradigms, and greater responsibility for engaging families and communities
in the education process (Guskey, 2000; Wohlstetter, Datnow, & Park, 2008). Ball
and Cohen (1999) further note that the type of instruction often called for by
education reform is not commonplace, nor can teachers change the way they teach
simply by being required to do so. Teachers “need opportunities to reconsider their
current practices and to examine others, as well as to learn more about the subjects
and students they teach” (Ball & Cohen, 1999, p. 3). These observations are even
more relevant when considering the increasing demands and expectations for
teachers that have since been set forth as part of NCLB legislation.
The need for effective and highly trained teachers is especially evident in the
field of early childhood education (ECE). Researchers and policy makers are
increasingly aware of the benefits afforded to children who participate in high
quality early childhood education. Research demonstrates that participation in high-
quality ECE programs can produce positive long-term academic and social outcomes
for young children (Barnett, 2003; Bowman, Donovan, & Burns, 2001; NAEYC,
2003). Unfortunately, these benefits do not accrue when children attend lower
quality programs. Of particular concern is the finding that programs of poor quality
have demonstrated potentially negative outcomes, particularly for social-emotional
development (Barnett, 2003, 2008).
Similar to findings in the K-12 education system, research indicates that
teacher quality is a key contributor to ECE program quality (Barnett, 2003). Yet,
unlike their counterparts in the public education system, preschool teachers are
required to possess relatively little education or prior experience to work with young
children. In fact, forty-two states require only a high school diploma to teach in a
licensed child-care facility (Barnett, 2003). Barnett (2003) notes that, while a
bachelor’s degree is a requirement to teach in the public education system, fewer than
50% of all preschool teachers in the U.S. possess a four-year degree in ECE. Given
this lack of pre-service education and training in the ECE field, in-service teacher
education plays an even more important role in creating positive outcomes for
preschoolers (Winton & McCollum, 2008).
Given the ambitious goals for student learning and the changing expectations
and roles of the classroom teacher, teachers will need significant training and
continuing education in order to successfully enact the types of changes that will
ultimately impact student learning and school improvement (Ball & Cohen, 1999;
Wilson & Berne, 1999). Hill (2007) identifies two main continuing education
pathways used to increase the skills of in-service educators: graduate education and
professional development.
Hill (2007) notes that enrollment in graduate education is fairly common in
the teaching workforce. One reason that many teachers choose to pursue graduate
education is that incentives for doing so are strong (Hill, 2007; Whitehurst, 2002).
Many states allow graduate coursework to count towards recertification requirements
and many districts provide salary increases to teachers who obtain graduate degrees.
Because of these incentives, approximately 49% of public school teachers hold a
master’s degree (Parsad, Lewis, Farris, & Greene, 2001). Although participation in
graduate degree programs is common and often rewarded, there is little current
evidence that this type of continuing education translates to improved student
outcomes (Clotfelter, Ladd, & Vigdor, 2007; Whitehurst, 2002). Hill (2007) notes
that descriptions of graduate programs indicate that many programs offer coursework
that is lacking in rigor. Rather than creating a coherent plan for learning, teachers
take coursework to fulfill state requirements resulting in a fragmented course of
study often removed from classroom practice (Hill, 2007). So, although graduate
education is prevalent amongst educators, there is little evidence that investment in
this type of professional development results in improved student outcomes.
Even more prevalent than graduate study is the use of professional
development to provide continuing education to in-service teachers (Hill, 2007).
Professional development encompasses a wide array of activities and formats which
include, but are not limited to, workshops, teaching institutes, lesson study,
mentoring, coaching, team meetings, and professional learning communities
(Buysse, Winton, & Rous, 2009; Guskey, 2000; Hill, 2007). Because professional
development is often required by school districts, 99% of public school teachers
surveyed by the National Center for Education Statistics (NCES) report
participating in professional development activities in the surveyed 12-month period
(Lewis et al., 1999). These statistics indicate that professional development is the
most utilized method for providing teachers with the training and education to
successfully achieve the ambitious goals of our education system.
Recognizing that teachers need opportunities for continuing education and
that professional development is the most utilized method of obtaining that training,
there has been an increased call for schools to provide teachers with effective,
empirically-validated, professional development (“No Child Left Behind Act of
2001,” 2001; The Teaching Commission, 2004). In the report “Teaching at Risk: A
Call to Action,” The Teaching Commission (2004) makes the case for ongoing,
targeted, and effective professional development by arguing that, “helping our
teachers to succeed and enabling our children to learn is an investment in human
potential, one that is essential to guaranteeing America’s future freedom and
prosperity” (p. 11). This argument is reinforced by NCLB requirements that states
provide professional development opportunities that are, “based on scientific
research…are high quality, sustained, intensive, and classroom-focused…and are not
1-day or short-term workshops” (“No Child Left Behind Act of 2001,” 2001, p.
1963). Through an evaluation of a program that prepares teachers to support and
address the needs of students who display challenging behaviors, this study explores
how educational organizations can evaluate the effectiveness of professional
development efforts in order to maximize student and teacher outcomes.
Background of the Problem
Despite the large amounts of time and resources invested in professional
development efforts (Hill, 2009), the professional development opportunities that are
available to teachers remain inadequate (Borko, 2004). “Teacher learning has
traditionally been a patchwork of opportunities – formal and informal, mandatory and
voluntary, serendipitous and planned – stitched together into a fragmented and
incoherent ‘curriculum’” (Wilson & Berne, 1999, p. 174). Typical professional
development opportunities are often seen as irrelevant, boring, and of little use to
teachers (Wilson & Berne, 1999). Ball and Cohen (1999) attribute this situation to
misguided perceptions and attitudes towards and about professional development
rather than a lack of funding. Rather than recognizing the need for deep, sustained
learning of content and pedagogy, the purpose of most professional development
efforts is seen as a simple updating of teachers’ skills. The results of such
misconceptions are fragmented, superficial workshops that rarely consider the needs
and dispositions of teachers as adult learners.
Although there is a call for high-quality, empirically validated professional
development for teachers, there has been little guidance as to how that professional
development should be delivered, what content should be covered, or what
constitutes “high-quality” professional development (Ball & Cohen, 1999; Borko,
2004). This inadequacy has led to a growing focus on developing a research agenda
that will not only validate the effectiveness of professional development
opportunities but also determine, “what and how teachers learn from professional
development,” and how that learning ultimately affects student outcomes (Borko,
2004, p. 4). This section will discuss what is currently known about professional
development, the importance of increased research in this area, and the difficulties
associated with such research.
Professional Development Research: Water, water, everywhere…
Meta-analysis shows that, despite the large body of literature that addresses
professional development, there are relatively few high quality studies from which
we can draw conclusions about the characteristics of effective professional
development (Garet, Porter, Desimone, Birman, & Yoon, 2001; Guskey, 2009;
Yoon, Duncan, Lee, Scarloss, & Shapley, 2007). Yoon, Duncan, Lee, Scarloss, and
Shapley (2007) examined over 1,300 studies that potentially addressed the effects of
professional development on student achievement. Only nine of those studies met
the evidence standards set forth by the What Works Clearinghouse, a clear indication of the
paucity of quality empirical research in this area. In addition to this lack of rigor,
many studies focus solely on participant reactions to the professional development
experience; very few studies have examined the relationship between specific
features of professional development and changes in teachers’ classroom practice or
in student outcomes (Garet et al., 2001; Kutner, Sherman, Tibbets, & Condelli,
1997).
Despite the scarcity of rigorous research that examines the mechanisms by
which professional development impacts teacher learning and student achievement,
Guskey (2000) has stated that, in his opinion, school reforms are never successfully
implemented without effective professional development. A review of studies which
examine the implementation of Comprehensive School Reform (CSR) models
suggests that a well-designed model for reform is not enough; teachers need
significant professional development in order to successfully implement reform
efforts (Desimone, 2002). Also consistent amongst studies of CSR models is that
lack of training is often cited as a reason for weak implementation of reform efforts
(Desimone, 2002).
A study of teacher professional development programs in math and science
conducted by the Council of Chief State School Officers (CCSSO) reviewed
evaluation studies of 25 professional development programs in 14 states (Blank, De
las Alas, & Smith, 2008). While the review did not specify the characteristics of programs that
led to improved outcomes, a number of the studies reviewed showed positive,
measurable effects. In particular, measurable effects of professional development
activities on subsequent student performance were reported in seven of the studies
reviewed. Ten studies reported measurable effects on increasing teacher content
knowledge and four studies reported effects on teachers’ instructional practices.
And although only nine studies met the evidence standards, the meta-analysis
conducted by Yoon et al. (2007) indicated that teachers who receive substantial
professional development, an average of 49 hours across the nine studies, can boost
student achievement by an average of 21 percentile points. These nine studies
evaluated professional development programs in math, science, and
reading/language arts. The average effect size for all nine studies was 0.54,
indicating a moderate effect of professional development on student achievement.
Effect sizes remained consistent across content areas; average effect size in science
was 0.51; in mathematics, 0.57; and in reading and English/language arts, 0.53.
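As a rough illustration (this back-of-the-envelope conversion is mine, not a calculation reported by Yoon et al.), the percentile-point figure follows from the average effect size if one assumes approximately normally distributed achievement scores:

\[
\text{gain} \approx \Phi(d) - \Phi(0) = \Phi(0.54) - 0.50 \approx 0.71 - 0.50 = 0.21,
\]

where \(\Phi\) is the standard normal cumulative distribution function. In other words, an average student in the comparison group (50th percentile) would be expected to score at roughly the 71st percentile of that group after the treatment, a gain of about 21 percentile points.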
While these studies did not indicate how or why the professional
development program positively impacted student learning, the findings of Yoon et
al. (2007), Blank, De las Alas, and Smith (2008), and Desimone (2002) indicate that
professional development has the potential to produce gains in student outcomes
(Borko, 2004; Garet et al., 2001; Yoon et al., 2007).
The Difficulties of Studying Professional Development
Guskey (2000) notes three specific problems with past efforts to identify
effective elements of professional development. The first problem is a lack of
agreement on appropriate criteria for measuring effectiveness. Reviews of the
research literature demonstrate that studies using student learning as the principal
criterion for determining effectiveness are rare. A large number of evaluations of
professional development rely solely on participants’ responses to the learning
experience (Kutner et al., 1997). Other research has focused on participants’ attitude
change as a result of participation, while other studies focus on the implementation of new
skills. This confusion in the research literature over the criteria of effectiveness
makes it difficult to compare results across studies.
A second problem with past efforts to identify effective professional
development is that researchers often focus on identifying elements and processes
that are consistent across programs and contexts. Guskey notes that while meta-
analyses that demonstrate the overall effects of professional development are useful,
these analyses also ignore much of the important information about professional
development uncovered in the individual studies. In other words, meta-analysis
ignores the complexities inherent in any educational endeavor. Guskey argues that
rather than search for main effects that are applicable across contexts, researchers
should instead seek to determine the conditions and contexts in which professional
development is likely to have positive effects.
The third problem with past efforts to identify effective professional
development is that research tends to focus on the presence or absence of particular
elements of professional development rather than how well that particular element
was implemented (Guskey, 2000). Quantity, the presence or absence of an element,
is easily measured but quality is more difficult to assess and therefore often
neglected. For example, it is fairly easy to count instances of collaboration that
occur during professional development activities, but far more difficult to determine
the quality and impact of those collaborative activities (Guskey, 2000).
In addition to the complexities described above, professional development,
like any other educational endeavor, is multifaceted; schools rarely implement just
one innovation, so determining the effect of any one innovation is difficult (Guskey,
2000). These factors contribute to the difficulty in studying the effectiveness of
professional development efforts (Guskey, 2009). In fact, it is because of this
complexity and the perceived difficulty of evaluating professional development that
many schools rarely plan for or allocate resources towards evaluating the
effectiveness of the professional development provided to teachers (Guskey, 2000).
This situation adds to the research community’s difficulty in identifying elements
of effective professional development.
In light of these complexities, Guskey (1994, 2009) would argue that there
are no generalizable best practices that will be effective for all teachers or all
schools; every school is different and therefore context “trumps” content and
process. This raises the question: if there are no generalizable best practices, why
bother to study professional development at all? The answer is that, despite the
complexities, effective professional development can help teachers to achieve the
educational goals and outcomes laid out in recent initiatives. High quality, effective
teachers can make a difference in student achievement and success. The following
section outlines an evaluation framework and a research agenda that can be used to
address the complexities identified by Guskey and move the field toward a more
nuanced understanding of the features of effective professional development.
Statement of the Problem
In the current climate of heightened accountability, educators can no longer
neglect the evaluation of professional development efforts. Guskey (2000) argues
that it is not acceptable for schools and teachers to invest time and resources into
questionably effective professional development activities. Educators should be able
to demonstrate that professional development activities result in positive changes in
classroom practice and improved student outcomes in order to justify the investments
of time and resources.
Evaluation can also provide better information to guide school reform and
professional development efforts (Guskey, 2000). At this time school leaders, at the
state, district, and school levels, have little to guide them in choosing effective and
meaningful professional development programs, providers, or guidelines for staff.
Rigorous study of professional development and its effectiveness can provide
administrators and teachers with more reliable guidelines for designing and/or
choosing effective programs.
Effective Evaluation of Professional Development
In order to fill the gaps in the research literature discussed earlier, it is
important for studies to address the challenges identified by Guskey (2000). This
section outlines the method of evaluation used in the current study and how this
study fits within a proposed research agenda to extend the field’s understanding of
effective professional development.
Designing effective professional development. The first step undertaken in
this study was to examine the design of the chosen professional development
activity. As noted earlier, two of the difficulties associated with identifying effective
professional development are that: (1) researchers often focus on the presence or
absence of particular elements of professional development rather than the quality of
their implementation and (2) research often ignores the influence of context on
professional development efforts (Guskey, 2000). Andragogy, the study of adult
learning, and the research literature which focuses on effective features of
professional development were used to evaluate the quality of design of the
professional development activity and to also consider the impact of contextual
variables. These two bodies of literature will be explored in depth in Chapter Two.
Following this evaluation of the activity’s design, the study examined the
effectiveness of the activity using the Kirkpatrick framework (2001, 2006; D. L.
Kirkpatrick & Kirkpatrick, 2007).
Evaluating professional development. The second stage of this study then
addressed the first problem identified by Guskey (2000), the lack of agreement on
appropriate criteria for measuring effectiveness. The study used Kirkpatrick’s
(2001, 2006; D. L. Kirkpatrick & Kirkpatrick, 2007) four-level evaluation
framework to address this issue. The Kirkpatrick framework was originally created
to evaluate the effect of training programs in business and industry, but has since
been applied in the field of education to evaluate professional development (Guskey,
2000). The four levels of evaluation outlined by Kirkpatrick (2001, 2006; D. L.
Kirkpatrick & Kirkpatrick, 2007) are: (1) Reactions, (2) Learning, (3) Transfer, and
(4) Results. The model provides a framework for evaluating a wide range of
outcomes that can be compared across studies. The Kirkpatrick framework will be
further examined in Chapter Two.
A professional development research agenda. Noting the paucity of
quality research on effective professional development, Borko (2004) proposed a
research agenda using a situative perspective that, “allows for multiple perspectives
and multiple units of analysis” (Borko, 2004, p. 5). The situative perspective taken
by Borko (2004) effectively addresses the research concerns raised by Guskey (1994,
2000, 2009) by emphasizing the importance of contextual variables.
Borko first identifies four key elements of the professional development
system: (1) the professional development program, (2) the teachers (i.e.,
participants), (3) the facilitator, and (4) the context in which the professional
development occurs. She then frames a three-phase research agenda that will move
educational research towards the goal of providing effective, high-quality
professional development for all teachers. The first phase of the research agenda
focuses on evaluating a single professional development program at a single site.
This phase focuses solely on the relationship between the professional development
program itself and the teachers. The goal of evaluation at this level is to demonstrate
that a particular program can have a positive impact on learning. This phase of the
research agenda, proposed by Borko (2004), has received the most attention from the
research community to date and was the focus of the current study.
Phase two of Borko’s research agenda focuses on the study of a single
professional development program implemented by multiple facilitators at multiple
sites. The goal at this stage is to determine whether a particular program can be
enacted with integrity in different settings and with different facilitators. Finally,
phase three of the research agenda focuses on the comparison of multiple
professional development programs enacted in multiple sites. The goal of phase
three research is to provide comparative information about the implementation of a
wide range of professional development programs. Due to the scope of the current
study, phases two and three were not specifically addressed. However, data about
the context of the study were used to provide limited insight into these phases as well
as directions for future research.
The Current Study
Like many organizations, Kamehameha Schools (KS) invests significant time
and resources towards in-house professional development efforts. KS has also
invested significant resources in using data for program planning and accountability.
As is common with most evaluations of professional development, the Community
Based Early Childhood Education (CBECE) division’s current professional
development evaluations tend to measure teachers’ reactions to professional
development rather than the degree to which teachers learned the material or
transferred that knowledge to classroom practice. While extensive outcome data and
evaluations of classroom practice are collected by the division, these data, which
directly pertain to transfer and to student outcomes, are not linked back to specific
professional development efforts. To date there has not been a great deal of focus on
uncovering the relationship between specific elements of professional development
and teacher and student outcomes.
Purpose of the Study
The purpose of this study was to evaluate the quality and effectiveness of a
professional development activity called the Challenging Behavior Process (CB
Process) training using a mixed methods approach. The CB Process training is a part
of a division-wide Positive Behavior Support (PBS) initiative that is in its fourth year
of implementation. The quality of the training was first evaluated using criteria
drawn from the literature about andragogy and effective professional development.
Demographic and qualitative data were used to provide a rich description of the
context in which the professional development training was implemented and
informed the evaluation of the professional development design. Then levels one,
two, and three of the Kirkpatrick framework (2001, 2006; D. L. Kirkpatrick &
Kirkpatrick, 2007) were used to evaluate the outcomes of the training. In addition to
the evaluation of teacher reactions (Kirkpatrick Level One) that is often used in
evaluating professional development, the current study directly evaluated changes in
teachers' knowledge and attitudes (Kirkpatrick Level Two) following the training
and the relationship between the professional development activity and changes in
teachers’ classroom practice (Kirkpatrick Level Three: Transfer). This was
accomplished by examining the relationship between participation in training and the
following outcomes: teachers’ reactions, teachers’ self-efficacy, teachers’ knowledge
about the CB Process, and teachers’ self-reported success with the CB
Process.
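As an illustrative sketch only (not the author's analysis code; the data values and variable names below are hypothetical), a Kirkpatrick Level Two comparison of pre- and post-training efficacy scores could be examined with a paired-samples t-test along these lines:

```python
# Illustrative sketch only: hypothetical TSES-style efficacy scores, not the study's data.
from scipy import stats

# Mean efficacy score per participant before and after the CB Process training
# (same participants in the same order, so a paired test is appropriate).
pre_scores = [6.8, 7.1, 5.9, 6.4, 7.5, 6.2, 6.9, 7.0]
post_scores = [7.2, 7.4, 6.1, 6.8, 7.6, 6.5, 7.3, 7.1]

# Paired-samples t-test: is the mean pre/post change reliably different from zero?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```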
Research Questions
The research questions guiding this investigation are:
1. Was the training designed to effectively meet the needs of teachers
according to andragogical theory and the criteria for effective
professional development?
a. Kirkpatrick Level 1 (Reaction): What were participants’ reactions
to the CB Process training?
2. Kirkpatrick Level 2 (Learning): What is the relationship between
participation in the CB Process training and participants’ perceived self-
efficacy?
3. Kirkpatrick Level 2 (Learning): What is the relationship between
participation in the CB Process training and participants’ understanding
of the CB Process?
a. Is there a relationship between participants’ evaluation of the CB
Process training and participants’ understanding of the CB
Process?
4. Kirkpatrick Level 3 (Transfer): Do participants report being able to
successfully implement the objectives taught in the CB Process training?
Significance of the Study
This study utilized levels one, two, and three of the Kirkpatrick framework
(2001, 2006; D. L. Kirkpatrick & Kirkpatrick, 2007) to evaluate the effectiveness of
the CB Process training. The results of this study will help CBECE administrators
at KS to evaluate the effectiveness of current training efforts on increasing teachers’
understanding, self-efficacy, and ability to apply knowledge gained from
professional development in actual practice. The focus of this study is a single,
mandatory, division-wide professional development training. The study evaluated
the effectiveness of the training while also considering a variety of contextual
variables that informed the findings. Penuel et al. (2007, p. 953) assert that,
“multiple studies are necessary to determine what works in professional
development.” This study provides important information about professional
development in the KS CBECE context that lays a foundation for future professional
development efforts undertaken by the division and provides suggestions for
improving evaluation efforts.
Definition of Terms
AEC. Assistant Educational Coordinator. The assistant administrator of
programs at the regional level. The AEC is a member of the RIT and reports to the
Educational Coordinator.
Andragogy. A theory advanced by Malcolm Knowles that specifically
addresses the needs and characteristics of adult learners.
BIR. Behavior Incident Report. Form used by teachers to document the
details of challenging behaviors that occur in the classroom.
CB Process. Challenging Behavior Process. The formal process for
documenting and addressing Challenging Behaviors.
Challenging Behavior. Includes (1) any repeated pattern of behavior that
interferes with learning or engagement in pro-social interactions with peers and
adults and/or (2) any behavior that poses a serious safety concern or is disruptive to
the entire class.
EC. Educational Coordinator. The administrator of programs at the regional
level. The EC is a member of the Regional Implementation Team and reports
directly to the Director.
FBA. Functional Behavioral Assessment. A behavioral assessment
conducted both at home and at school with the goal of determining the function that
a challenging behavior serves for a child.
IP. Intervention Plan. Formal plan that stipulates teacher, support staff,
administration, and family responsibilities for addressing challenging behaviors that
are minor and ongoing.
IS. Instructional Specialist. Member of the Regional Implementation Team
who provides direct curricular support to teaching staff.
ISP. Individualized Support Plan. Formal plan that stipulates teacher,
support staff, administration, and family responsibilities for addressing behaviors
that pose health and safety concerns and/or behaviors that have not responded to
supports provided by an Intervention Plan.
OC. Outreach Counselor. Member of the Regional Implementation Team
who provides counseling support to students, families, and teaching staff. The OC
takes the lead in creating and documenting IPs or ISPs.
PBS. Positive Behavior Support. A cognitive behavioral system designed to
support the social-emotional development of young children.
PD. Professional Development; processes and activities designed to enhance
the professional knowledge, skills, and attitudes of educators with the goal of
improving student learning (Guskey, 2000).
RIT. Regional Implementation Team. Regional Support Team which
includes the EC, AEC, IS, and OC. The RIT is responsible for implementing the CB
Process and providing regional training and support to teaching staff.
Self-Efficacy. A theory originally proposed by Albert Bandura (1977) which
refers to a person’s beliefs concerning his or her ability to successfully perform a
given task or behavior.
Organization of the Study
This study is organized into five chapters. The first chapter provides an
introduction and background of the problem. High quality, effective professional
development opportunities provide teachers with the necessary training to impact
student achievement and to successfully meet the heightened expectations of today’s
education system. Although educational organizations invest a great deal of time
and resources in professional development, these efforts are often irrelevant or
ineffective. Furthermore, organizations often lack the time and resources to
adequately evaluate the effectiveness of professional development efforts. The aim
of this study is to examine the impact of one professional development training on
teacher learning, teacher self-efficacy, and implementation of skills.
Chapter Two provides a literature review of the problem. The chapter
examines how the adult learning theory, Andragogy, can be used to inform
professional development efforts. The Andragogy in Practice Model is introduced as
a method of evaluating the design of professional development. Core features of
professional development that have been identified in the literature and informed by
the theory of Andragogy are discussed. The chapter also examines the current
literature surrounding significant outcomes of professional development: teacher
learning and teacher self-efficacy.
Chapter Three presents the methodology proposed for the study, including
the research design, population and sampling procedure, the selection and/or
development of the instruments, information about validity and reliability, the
procedures for data collection, and the plan for data analysis.
Chapter Four reports the findings of the study. First, the sample
demographics that inform the findings are presented. Next, the research findings,
organized by research question, are presented. The chapter concludes with a
summary of findings.
Chapter Five provides a brief summary of the findings and a discussion of the
implications for practice, limitations of the study, and conclusions.
CHAPTER TWO
LITERATURE REVIEW
Introduction
In the current era of heightened accountability, it is increasingly important to
evaluate the professional development that is provided to teachers. It is no longer
acceptable for schools and teachers to invest in professional development activities
that do not result in positive outcomes for both teachers and students (Guskey, 2000).
The framework developed by Kirkpatrick (D. L. Kirkpatrick & Kirkpatrick, 2007)
can assist educators in assessing the success of any professional development
program at four levels: reactions, learning, transfer, and student outcomes.
Evaluation can assist educational leaders in providing teachers with effective
learning opportunities that can help to improve classroom practice and, by extension,
student outcomes. This is especially true in the field of Early Childhood Education
where teacher education varies substantially and is generally quite low (Barnett,
2003). Research has demonstrated that teacher quality is strongly related to
classroom quality and child outcomes; particularly in the area of social and
emotional development (Barnett, 2003; Bowman, Donovan, & Burns, 2001; Copple
& Bredekamp, 2009; Snider & Fu, 1990). While raising the education requirements
for teachers shows some promise for improving the field overall, research has also
shown that a four-year degree may be necessary, but not sufficient to ensure teacher
quality (Early et al., 2006). Early and colleagues (2006) suggest that the field would
also benefit from specific, effective professional development for all teachers.
Kirkpatrick (D. L. Kirkpatrick & Kirkpatrick, 2007) asserts that effective
evaluations of professional development should go beyond assessing participant
reactions. To justify the investment of time and resources, professional development
should demonstrate effectiveness at promoting teacher learning, changes in
teachers’ classroom practices, and student outcomes; Kirkpatrick refers to this as “a
compelling chain of evidence as to the value of learning to the bottom line” (D. L.
Kirkpatrick & Kirkpatrick, 2007, p. 123).
The preceding argument establishes the need to evaluate the effectiveness of
teacher professional development. However, this argument raises the question of
what makes a professional development opportunity effective in the first place.
Guskey (2000) notes that many reform efforts specify the need for professional
development, but offer little guidance for educators as to what constitutes effective
professional development. In order to better understand what effective professional
development looks like and to provide a framework for the evaluation that is the
focus of this study, this literature review will focus on two main bodies of research
that offer insight about the factors that determine the effectiveness of professional
development. These two bodies of research are: (1) the study of andragogy and (2)
the study of characteristics of effective professional development.
This chapter will provide an in-depth review of the research literature
pertaining to andragogy and effective professional development followed by a
detailed discussion of Kirkpatrick’s four levels of evaluation. Each section will also
discuss the relation between the research literature and the current study.
Adult Learning Theory
Andragogy, which literally means the art and science of leading adults, is a
theory advanced in the U.S. by Malcolm Knowles. The term andragogy has been
used to differentiate the process of teaching adults from pedagogy, the art and
science of leading children. Knowles (1978) notes that the education of adults has
long been a concern of the human race, but that the focus of attention has often been
on the end results of education rather than the actual process of adult learning. The
study of learning has traditionally focused on the study of children and animals.
Although these learning theories have provided multiple perspectives and insights
into the learning process, Knowles (1978) notes that well into the twentieth century
pedagogy was the only theoretical framework for education. In other words, the
underlying assumption of learning theory was that all learners, adults and children,
learned in the same ways and by the same means.
It was not until the end of World War I that educators began to recognize that
adult learners may possess needs and characteristics different from those of children.
Influenced by the work of theorists in fields including developmental, cognitive,
behavioral, humanistic, and social psychology; psychotherapy; sociology; and
education, Malcolm Knowles began forming a comprehensive theory of adult
learning called andragogy (Knowles, 1984). The concept of andragogy had been
evolving in Europe for some time, but it was Knowles who advanced the theory in
the U.S. and with whom the American theory of andragogy is most identified.
Knowles sought to construct a theory that clearly differentiated the needs of adult
learners from the needs of young learners (Knowles, 1978). Because teacher
professional development is specifically aimed at adult learners, the theory of
andragogy offers the developers of professional development programs insights that
will maximize the effectiveness of learning opportunities for teachers.
Andragogy: Six Assumptions of Adult Learners
Knowles’ original theory is rooted in humanistic and pragmatic philosophies
which result in a theory that is primarily concerned with the self-actualization of
individuals and the value of knowledge that is gained through experience.
Additional influences of behaviorism and constructivism help to create a theory that
focuses on the learner and learning process rather than the social and political
outcomes of the learning process. The six core assumptions of andragogy are: (1)
the learner’s need to know, (2) self-concept of the learner, (3) prior experience of the
learner, (4) readiness to learn, (5) orientation to learning, and (6) motivation to learn.
The learner’s need to know. Before engaging in any learning opportunity,
adult learners need to know why they need to learn the material being presented.
Adults will spend considerable time and energy to determine the benefits of learning
something and the consequences of not learning it. Knowles suggests that a
responsibility of the adult educator is to convincingly argue the value of learning
something to improving the learners performance or quality of life. Keeping this
assumption in mind, teacher professional development should clearly define how the
material being presented is both relevant and beneficial to teachers’ performance.
Self-concept of the learner. Mature adults are those who have developed
the self-concept of being independent and responsible for their own decisions and
lives. Adults with this self-concept strive to be seen as self-directed, and resent
situations in which this independence and self-directedness are not respected.
Educational contexts in which learners are solely dependent on the teacher for the
transmission of knowledge are not congruent with the needs of the adult learner.
This assumption indicates that teacher professional development programs should
provide opportunity for teachers to engage in self-directed learning opportunities
where teachers are actively engaged in creating meaning for themselves.
Prior experience of the learner. Knowles notes that adults have both a
greater quantity and different quality of experience than do youths. Simply by virtue
of having lived longer, adults have a greater accumulation of experiences from which
to draw. Furthermore, the types of experiences that adults have had are different as
well. The accumulation of experience means that a group of adult learners will
typically be quite heterogeneous in terms of background and experience and will
benefit from individualized instruction that capitalizes on these experiences and uses
them as a teaching resource.
It is also important to realize that as adults mature and gain experience, they
also begin to define themselves in terms of their life experiences; their experiences
represent who they are. To ignore or undervalue these experiences can be perceived
as a rejection of the learner as a person. Successful teacher development programs
will respect the variety of experience that participants possess and will utilize these
experiences as resources in the learning process.
Readiness to learn. Adults are ready to learn those behaviors, ideas, and
tasks that will enable them to better cope with the situations they encounter within
their daily lives. Knowles notes that timing is everything; developmental tasks that
are associated with moving from one developmental stage to another represent
unique opportunities to capitalize on the adult learner’s readiness to learn. This
assumption can be applied when examining how schools and districts structure
professional development for teachers. Rather than engage in professional
development simply to obtain credit for doing so, teachers should be encouraged to
create cohesive professional development plans that are tailored to their current
developmental career stage.
Orientation to learning. Related to readiness to learn is an adult’s
orientation to learning. Adults tend to be task-, problem-, and life-centered in their
orientation to learning. They are motivated to learn those things that they perceive
will help them to increase their performance and competency at important life tasks.
Similarly, adults learn more effectively when new knowledge is presented in the
context of application to real-life situations. Professional development programs
should offer teachers practical, problem-centered activities that help participants to
connect learning to current practice.
Motivation to learn. Finally, like children, adults respond to external
motivators such as job promotion and increased salary. However, the most powerful
motivators for adults are intrinsic in nature. Knowles noted that factors such as job
satisfaction, self-esteem, and quality of life represent strong intrinsic motivation for
learning. Accordingly, professional development programs should be designed to
maximize intrinsic learner motivation.
Criticisms of Andragogy
There has been extensive criticism of andragogy as an “organizing principle
in adult education” (Davenport & Davenport, 1985). Brookfield (1986) notes
criticism by Jarvis and others who argue that andragogy lacks the empirical support
necessary to justify its position as educational doctrine. A central criticism of
andragogy as a comprehensive learning theory is that in some contexts both children
and adults can display self-directed learning, but more importantly many adults do
not display self-directedness. Brookfield (1986) notes that his main concern is not
that children can display self-directedness, but that there is a lack of self-directedness
amongst adults in many cultures. Self-directedness appears to be a result of maturity
and culture as opposed to an innate characteristic of adults.
Other theorists suggest that rather than being considered a separate theory,
andragogical principles are best placed under the umbrella of pedagogy (Merriam,
1981). Milligan’s (1995) comparison of andragogy to Freire’s Pedagogy of the
Oppressed notes similarities between the two theories; in particular the idea of a
horizontal relationship between teacher and student and problem-centered learning.
He suggests that the two theories are related and can be subsumed within an
overarching pedagogical theory of learning.
Finally, a number of theorists who apply critical theory to the study of
education have criticized andragogy for its focus on the individual and lack of a
critical social agenda. Still others have sought a theory of adult learning that reaches
beyond the process of learning to encompass desired outcomes such as perspective
transformation. The common thread amongst these criticisms is the andragogical
focus on individual growth and change and lack of attention to how these changes
relate to societal change. Knowles acknowledged the tension between those who
believe the goal of education is individual growth and those who believe that the
overarching goal of education should be to improve society (Knowles et al., 2005).
Support for Andragogy
Despite the criticisms leveled against it as a comprehensive theory, “the
concept [of andragogy] remains the single most important contribution to a uniquely
adult theory of teaching and learning” (Brookfield, 1984, p. 189). Brookfield
suggests that rather than argue about whether andragogy is a comprehensive theory
of education, it is useful to recognize that the principles and assumptions of
andragogy and pedagogy are appropriate, regardless of age, at different times and for
different purposes. He notes that Knowles also arrived at this conclusion as his
ideas evolved.
Knowles originally conceived of andragogy as an integrated set of
assumptions that represented the antithesis of pedagogy. Pedagogy emphasizes the
experience of the teacher in planning and transmitting knowledge while andragogy
places importance on the experiences of the learners as the richest resource for
learning (Ingalls & Arceri, 1972; Knowles et al., 2005). Table 1 summarizes the
differences between pedagogical and andragogical assumptions identified by
Knowles, Holton and Swanson (2005).
Table 1
Comparison of Pedagogical and Andragogical Assumptions
Assumption: The need to know
  Pedagogy: Learners need to know what the teacher wants them to learn in order to succeed, not how what they learn will apply to their lives.
  Andragogy: Learners need to know “why” they need to learn (i.e., the “value” to the learner’s life).

Assumption: Concept of the learner
  Pedagogy: Dependent on the teacher for knowledge.
  Andragogy: Self-directed, independent learner.

Assumption: Role of the learner’s experience
  Pedagogy: The teacher’s experience is valued and must be transmitted to the learner.
  Andragogy: Assumes volume and quality of experience. Emphasis is on using the learner’s experience as a resource.

Assumption: Readiness to learn
  Pedagogy: Learners become ready to learn what they are told to learn.
  Andragogy: Ready to learn things they need to know and be able to do to effectively manage real-life situations.

Assumption: Orientation to learning
  Pedagogy: Subject-centered orientation.
  Andragogy: Life-centered, task-centered, or problem-centered orientation.

Assumption: Motivation to learn
  Pedagogy: Externally motivated (e.g., grades, approval/disapproval).
  Andragogy: Most potent motivators are internal (e.g., self-esteem, job satisfaction), but learners are also motivated by external factors.
Note. Adapted from “The adult learner: The definitive classic in adult education and
human resource development,” by M.S. Knowles, E.F. Holton, and R.A. Swanson,
2005, loc 724 – 777.
However, as these andragogical assumptions were applied in practice it
became evident that some assumptions could be applied towards the education of
children, and conversely that there were situations in which these assumptions may
not be an appropriate framework when working with adults. Based on feedback
from educators, Knowles reconsidered his belief that andragogy and pedagogy were
dichotomous principles. He reconceptualized andragogy as a system of alternative
sets of assumptions to pedagogy; an individual-transactional model that speaks to the
characteristics of the learner and the learning situation. Educators are responsible for
testing which assumptions, andragogical or pedagogical, are realistic in any given
situation. In some cases, with some learners, it may be more appropriate to apply
pedagogical assumptions, while in other cases the assumptions of andragogy may be
more appropriate. The flexibility of andragogical theory emphasizes the
responsibility of professional development facilitators for evaluating the needs of the
participants to effectively tailor the learning opportunities offered. The need for
flexibility is especially apparent in the literature regarding effective features of
professional development. This will be discussed in the next section.
Knowles also addressed the objections to andragogy raised by critical
theorists, noting that they were correct in their observations that andragogy does not
specifically address learning outcomes such as social change. As the concept of
andragogy evolved, Knowles came to understand andragogy as a flexible set of
alternative assumptions about adult learning rather than a unifying theory of adult
education. Knowles argued that andragogy is best understood as a transactional
model of adult learning that can be applied across a variety of settings, to meet
varied learning goals. The main focus of andragogy, which is rooted in pragmatic
and humanistic philosophy, is the learning process itself, not the outcomes of
learning. This argument suggests that criticisms of andragogy which focus on why
adult learning programs are conducted are not applicable to andragogy. Andragogy
does not address the goals of education but is instead concerned with how learning
occurs.
Applying Andragogy: Andragogy in Practice
The study of andragogy provides insight into what an effective professional
development program might look like. However, a criticism of andragogical theory
is that it offers little guidance as to how these ideas can be applied in specific
contexts, even though context has been noted to be an important factor to consider when
implementing any professional development program (Knowles, Holton, &
Swanson, 2005).
Knowles et al. (2005) assert that the fact that andragogy does not speak to all
the goals and purposes of learning is a strength rather than a weakness. In fact, the
power of andragogy lies in its potential for flexible application across any and all
learning environments. Knowles intended his six assumptions to have enough
flexibility to be altered to fit the needs of each particular learning situation; educators
have the option to adopt the system whole or in part. As noted earlier, Knowles
placed the responsibility on the educator for testing the appropriateness of each
assumption for the given situation. To assist adult educators in applying the
principles of andragogy, Knowles et al. (2005) created the Andragogy in Practice
Model, which recognizes the heterogeneity of learners and learning contexts. This
model provides a framework that will assist educators to systematically adapt and
apply andragogical principles across multiple learning contexts. This study used the
Andragogy in Practice model to analyze the design of the CB Process training.
The Andragogy in Practice model consists of three learning process
dimensions which interact with one another: (1) the six core assumptions of
andragogy, (2) individual and situational differences, and (3) goals and purposes for
learning. Figure 1 provides an illustration of the Andragogy in Practice model and
the related contextual variables that were examined in the study. This section will
provide an overview of the model as well as a more detailed discussion of specific variables and concepts that are important to this study.
Figure 1. Andragogy in Practice model. Adapted from "The adult learner: The definitive classic in adult education and human resource development," by M.S. Knowles, E.F. Holton, and R.A. Swanson, 2005, loc. 1656. The figure maps the variables examined in this study onto the model: individual learner differences (survey: education level, teaching experience), situational differences (interview: CBECE context), subject matter differences (CB Process), individual growth (survey: teacher efficacy, teacher knowledge), institutional growth (survey: change in classroom practices; student outcomes not measured), and societal growth (student outcomes not measured).
The Innermost Ring: Core Learning Principles
When using this model to think about professional development, the
innermost ring, which focuses on the six main assumptions of adult learners
discussed earlier, should be used to inform the design of effective learning activities.
Knowles, Holton, and Swanson (2005) assert that without any other information,
these six assumptions provide a solid foundation for planning adult learning
experiences. The current study utilized qualitative interview data regarding the
purpose, design, and implementation of the CB Process training to determine the
extent to which these factors were or were not present. Follow-up survey data were
also used to provide insight into these factors.
The Middle Ring: Individual and Situational Differences
The middle ring of the Andragogy in Practice model represents individual
and situational differences. These differences are conceptualized as variables that
will affect which assumptions are appropriate for the learning situation. Depending
on the context, some assumptions may be emphasized or prove more effective than
others. Knowles et al. (2005) identify three categories of differences that will be
discussed here: (1) individual differences, (2) situational differences, and (3)
subject-matter differences.
This type of contextual analysis is supported by other strands of research;
both Penuel et al. (2007) and Guskey (2003) note the necessity of studying how
individual and contextual variables affect the delivery of professional development
and its effectiveness. Demographic data gathered from the pre-test survey were used
to inform this level of the model.
Individual differences. Individual differences represent the unique
experiences and characteristics that each learner brings to the learning experience.
Knowles and colleagues (2005) note that the area of individual differences is one in
which our understanding of adult learning may have grown the most in recent years.
It is beyond the scope of this chapter to delineate all the individual differences that
may affect learning. However, this section will focus on two specific variables that
are often considered in the study of effective professional development. These
variables are: (1) education level and (2) years of teaching experience.
Education level. As noted in Chapter One, the recently released NAEYC
Position Statement on Developmentally Appropriate Practice (DAP) recognizes the
growing awareness amongst both policy-makers and the public that early childhood
education (ECE) experiences can provide a strong and essential foundation for future
learning (Copple & Bredekamp, 2009). Participation in high-quality ECE can
produce long-lasting and significant educational and social benefits for young
children (Barnett, 2003; Bowman, Donovan, & Burns, 2001; Copple & Bredekamp,
2009; NAEYC, 2003). However, these findings are true only of early childhood
programs of high-quality. The general consensus in the field is that the positive,
long-term effects of preschool education on student outcomes occur only in
programs where the teachers have earned a baccalaureate degree with specialized
ECE training (Barnett, 2003). Highly-trained teachers possess the pedagogical
knowledge necessary to recognize the various learning styles, needs, and strengths of
the diverse student population (W. Barnett, 2003; Bowman, Donovan, & Burns,
2001; Copple & Bredekamp, 2009). This expertise allows teachers to tailor
curriculum to suit the particular needs of their students (Barnett, 2003; Bowman et
al., 2001; Copple & Bredekamp, 2009; NAEYC, 2008).
Based on analyses of data from the National Center for Early Development
and Learning's (NCEDL) Multi-State Study of Pre-Kindergarten, Early et al. (2006)
question the validity of this commonly held belief that a Bachelor’s degree is
necessary to ensure program quality. Operating under the assumption that the
provision of high-quality early education requires that teachers are knowledgeable
and highly skilled, Early et al. (2006) explored whether or not our educational and
credentialing systems are actually producing and identifying these types of
professionals; does a Bachelor’s degree guarantee that a teacher is prepared to teach?
The authors operationalized teachers’ education and credentials in six
different ways in order to conduct a fine-grained analysis of the relationship between
teachers’ education and classroom quality and students’ academic gains. Teacher
education was operationalized in the following ways: years of education, degree,
attainment of a Bachelor’s degree, ECE specific training, CDA, and state teaching
license. Early et al.’s (2006) analysis demonstrated a consistent link between
teachers’ education, specifically attainment of a bachelor’s degree, and student gains
in math skills and scores on the Teaching and Interactions subscale of the Early
Childhood Environment Rating Scale – Revised (ECERS-R). However, there were
no associations between teacher education and any other measures of classroom
quality or other academic gains, such as literacy development, across the school year.
The authors make the argument that a Bachelor’s degree may be necessary, but not
sufficient for the provision of high-quality preschool experiences.
While the Early et al. study indicates that further examination of the effects
of preschool teacher training is necessary, caution should be used when interpreting
the results. Early et al. (2006) examine classroom and student outcomes over the
course of a single school year. However, many of the demonstrated positive effects
of preschool education are long-term effects, such as lower levels of special education placement, lower rates of grade retention, and lower dropout rates
(W.S. Barnett, 1985; W. Barnett, 2008; Frede, Jung, Barnett, & Figueras, 2009;
Magnuson & Waldfogel, 2005; Nores, Belfield, Barnett, & Schweinhart, 2005)
which are not captured by the measures used in the study. Furthermore, by using a
stratified random sample of only state-funded preschool programs, many of which
provide services only a few hours per week, the data may not be wholly
representative of the wide range of programs and systems across the states sampled.
More research is necessary before generalizing study results to other states or
prekindergarten programs.
Early et al.’s (2006) results raise important questions and concerns for early
childhood educators, highlighting knowledge gaps in the research literature.
Although most of the research literature demonstrates a link between global quality
in ECE and higher levels of teacher education, Early et al. (2006) note that teacher
education is not commonly operationalized, leaving gaps in our understanding of
what specific features of teacher education are important for ensuring ECE program
quality.
The authors suggest that simply setting and maintaining teacher certification
standards is not sufficient to ensure program quality and increased child outcomes.
Early et al. (2006) suggest that the field of ECE may benefit from focusing on the
specific content of professional development programs to ensure that teachers have
the level of expertise necessary to ensure the delivery of high-quality early childhood
education.
Teachers at KS possess substantially higher levels of education than
reported by the ECE field as a whole. This study considered the possible effects of
education level in the analysis of the CB Process Training design.
Years of teaching experience. Many studies of professional development utilize
years of teaching experience as an independent variable. This is based on career
stage theory, an outgrowth of lifespan development theories. Huberman’s (1989)
career stage model is one that has found wide acceptance in the literature and
provides a thorough description from beginning to end of the teaching career. The
model has been frequently used to provide a lifespan perspective when interpreting
the results of empirical study (Anderson & Olsen, 2006; Brekelmans, Wubbels, &
van Tartwijk, 2005; Choi & Tang, 2009). According to career stage theory, teachers
progress through a series of stages that represent major phases of career
development. The interests, attitudes, and needs of teachers differ as they move
from stage to stage. It is important to note that the model may not apply to all
teachers in the same way; individual teachers vary in how they progress through the
stages and in whether they pass through some stages at all (Villegas-Reimers, 2003).
Despite these limitations, career stage theory offers a starting point for understanding
the needs of teachers at different stages of their careers. Huberman’s model
describes five consecutive stages of development: (1) career entry, (2) stabilization,
(3) first divergent period, (4) second divergent period, and (5) disengagement.
Villegas-Reimers (2003) provides a summary of the various stages. The first
stage, Career Entry, spans the first 1-3 years in the profession. This represents a period of both survival and discovery for new teachers. The 4th through 6th years of a
teacher’s career represents the second stage, Stabilization. During this time teachers
begin to achieve a sense of teaching mastery and make a commitment to teaching as
a career. Teachers in their 7th through 18th years of teaching experience a divergent period.
At this stage of the teaching career model teachers may take one of two paths.
Huberman (1989) notes that some teachers experience a time of experimentation and
activism as they try out new ideas, develop their philosophy of teaching, and
confront institutional barriers. These activities may lead to increased activism within
the school community and result in additional professional responsibilities and/or
promotion. Other teachers, who do not engage in experimentation, may experience a
period marked by self-doubt and reassessment resulting in a decision to leave the
teaching profession. Huberman's fourth stage, spanning the 19th to 30th years of a
teaching career, represents yet another divergence. Teachers at this stage often
experience either serenity or conservatism. Serenity represents a greater sense of
self-acceptance, but also a decline in engagement and career ambitions. On the other
hand, teachers experiencing conservatism become skeptical of innovation and critical
of the education system and profession. The final stage of the teaching career covers
years 31 through 50. At this stage teachers undergo a gradual separation from the profession
as they prepare for retirement. For some this period is marked by reflection and
serenity, while others experience bitterness and regret.
The career stage model can be used to inform professional development
efforts, helping facilitators to better assess the needs and interests of their students.
Richter, Kunter, Klusmann, Lüdtke, & Baumert (2010) utilized empirical
data from the 1999-2000 U.S. Schools and Staffing Survey (SASS) to show the relationship
between Huberman’s stage theory and the types of professional activities that
teachers engage in throughout their careers. The authors found that during career
entry teachers participated in both formal and informal professional development
activities. These teachers participated more frequently than any other group in
mentoring or peer observation, frequently attended formal activities such as
conferences and workshops, and continued to attend university courses in their main
teaching subject. Beginning teachers pursued professional development
opportunities that focused on classroom management and student discipline; issues
that represent common challenges for new teachers. Teachers in the third stage of
career development also frequently attended formal workshops and conferences;
unlike peers at other career stages, these teachers were more involved in peer
collaboration, individual collaborative research, and observational visits to other
schools. Teachers in stage three most often pursued professional development
opportunities that related to their teaching subject, content and performance
standards, and teaching methods. These figures support the theoretical proposition
that many teachers at this stage of career development are focused on experimenting
and expanding their repertoire of skills.
SASS data indicate that teachers with more than 20 years of experience
demonstrated lower participation rates across all types of learning opportunities. In
particular, these teachers demonstrated the lowest rates of participation in university
courses in their subject areas, recertification, and advanced certification. They also
demonstrated lower participation in all content areas but two: the instructional use of
computers and student assessment. These findings also support the Huberman model
which posits the decline of teacher engagement in the later years of a teacher’s
career. It is important to note, however, that although Huberman’s last two stages
span the 19th through 50th years of the teaching career, the longest career range specified by
SASS data is 20 years or more. This may be an artifact of the survey data that were used, or it may indicate that teaching careers rarely extend beyond 20 years and that the theory should be refined to address more realistic career ranges.
This study considered the experience level of teachers, both in early
childhood education and at the organization, when evaluating the design of the CB
Process training.
Situational differences: context matters. As noted both in Chapter One
and earlier in this chapter, context is an important factor to consider when
implementing any professional development activity. Guskey (2000) discusses the
impact of context on school reform efforts. He notes that each school’s culture is
highly contextualized. One-size-fits-all reforms that fail to take the unique cultures
and contexts of schools into account often result in failure. On the other hand,
reforms that find ways to adapt to and even capitalize on context are often
successful. Likewise, professional development activities that hope to create
changes in teacher practice should consider the unique contexts in which
professional development occurs.
This study utilized information gathered through an interview to provide
insight into the unique characteristics of the KS context that relate to the design of the CB Process training.
Subject matter differences. Knowles, Holton, and Swanson (2005) note
that subject matter differences may require different learning strategies. The authors
use the example of complex technical subject matter, noting that self-directed
learning strategies may not be the most effective approach. Because the training
examined in this study addresses a complex process, it is important to explore the
findings regarding training professionals to complete a process rather than simply
teaching a new skill.
Although the literature surrounding effective training of Positive Behavior
Support (PBS) procedures is sparse, a recent study of the PBS Training Curriculum (PBS-TC) by Kincaid, Peshak George, and Childs (2006) indicates that the current
PBS-TC teaches many of the skills necessary for participation in PBS, but does not
adequately teach about the PBS process itself. The authors suggest that the training
was not comprehensive enough to prepare the participants to adequately participate
in the collaborative PBS process. In order to provide teachers with a clear picture of
the complete process the authors suggest that a hands-on case study approach,
wherein participants are able to observe and perform each step in the process, should
be utilized. Furthermore, they foresee the need to provide follow-up support for
teachers in the form of additional curricula addressing critical issues as well as
support measures for teachers as they implement the process in their classrooms
(Kincaid et al., 2006).
Kincaid et al.’s findings seemed to contradict the findings of several
laboratory studies of Functional Behavior Analysis (FBA) and behavior intervention,
processes used in the PBS approach. These studies had indicated that “crash course”
trainings are sufficient to promote proficiency in implementation (Iwata et al., 2000;
Moore et al., 2002). However, a number of recent field studies have challenged this
belief and provide support for Kincaid et al.'s recommendations. In a study
examining the implementation of Functional Behavior Assessment (FBA) processes,
Dukes, Rosenberg, and Brady (2008) compared the performance of teachers who
participated in an initial workshop style training to teachers who received no
training. The authors found that despite being able to answer knowledge-based
questions about FBA more accurately, teachers who attended training demonstrated
no statistical difference in their ability to provide strategy recommendations for
promoting behavioral change. Dukes et al. (2008) hypothesize that the format and
delivery of the professional development training did not allow teachers the time or
opportunity to develop fluency in implementing new skills prior to using these skills
in the classroom.
Similar to the findings of Dukes et al. (2008), Scott, Nelson, and Zabala
(2003) found that "crash course" type trainings on FBA and behavior intervention
were insufficient to prepare teachers to perform at the level of best practice. Longer,
more intensive training is necessary to develop the expertise needed to go beyond the acquisition of discrete skills to the ability to coordinate these skills to complete a
complex process.
Finally, Mayer et al. (2007) examined the effects of behavior analysis
training on the ability of teachers to create and implement support plans for students.
Teachers who received training on both the key concepts of behavior analysis and
how to evaluate and rate the quality of PBS plans were more than four times as likely
to develop good or superior plans as teachers who received training on the key
concepts alone. The study results suggest that the additional training gave teachers a
chance to integrate the skills learned in the initial training, providing them with a
more complete understanding of the process. The study results support the findings
of Kincaid et al. that active, in-depth learning opportunities are necessary for teachers to develop expertise in implementing a process as opposed to learning discrete skills.
The CB Process training is a brief, one-time workshop that is part of a larger, ongoing training effort. Information about the workshop itself, together with Follow-up survey data on the types of support experienced by teachers, was used to evaluate the design of the training.
The Outer Ring: Goals and Purposes for Learning
The outermost ring, goals and purposes for learning, refers to the desired
outcomes of professional development. Knowles, Holton, and Swanson (2005)
indicate that these goals should be clearly identified because they are "the frame that shapes the learning experience" (loc. 1736). This echoes Guskey's (2000) assertion that criteria for the effectiveness of professional development must be identified in order to begin to understand what effective professional development looks like. Knowles (2005)
identifies three categories of goals and purposes: (1) Individual Growth, (2)
Institutional Growth, and (3) Societal Growth. Although andragogy specifically
concerns itself with the process of learning rather than the outcomes of learning,
Knowles acknowledges that in applying andragogy, goals and purposes help to shape
the learning experience. All three categories will be discussed here, however it is
beyond the scope of this dissertation to examine and measure all three categories.
The outcomes measured in this study were related to individual growth. However,
Institutional and Societal Growth outcomes were considered in the evaluation of the
CB Process training design.
Individual growth. Knowles notes that adult learning scholars traditionally
view the goals of learning in terms of individual growth (Knowles et al., 2005). In
fact, at first glance, andragogy’s focus on individual learner characteristics would
support this focus. This study will address two areas of individual growth: (1)
Teacher Efficacy and (2) Teacher Knowledge.
Teacher efficacy. The concept of self efficacy as conceived by Bandura
(1977) is defined as a person’s belief about their ability to “organize and execute the
courses of action required to produce given attainment” (Bandura as cited in
Tschannen-Moran, Woolfolk Hoy, & Hoy, 1998, p. 207). Self efficacy is further
separated into two forms: efficacy expectations and outcome expectations. Efficacy
expectations refer to an individual’s belief in his or her ability to successfully
perform a task. Those with high efficacy expectations believe in their ability to
perform a task at a given level and therefore will expend more effort, display greater
persistence in the face of obstacles, exhibit greater resiliency in the face of failure,
and cope better with stress in demanding situations (Bandura, 1997). Outcome
expectations refer to an individual’s estimate of the consequences of performing
given their level of competence (Bandura, 1986). Bandura (1986) noted that
outcome expectancy judgments can provide strong incentives or disincentives for
engaging in particular behaviors.
It is important to note that efficacy is separate and distinct from other
concepts of self such as self-esteem, which reflects overarching feelings of self-worth or self-liking. A person may feel inefficacious for a particular task, yet have a healthy
self-esteem. On the other hand, an individual may display a high level of skill
(effectiveness) yet evaluate themselves negatively because they have set personal
standards that are difficult to meet. Self-efficacy relates to the perception of
competence rather than actual ability. This is an important distinction because over- or underestimating one's ability may have consequences for the decisions people
make to pursue a specific course of action or to expend effort towards a certain
pursuit. In most cases, slightly overestimating one’s competence has the most
positive effects on performance; individuals tend to be more persistent and expend
greater effort.
The concept of self-efficacy has been shown to have significant impact in the
field of education. Teacher efficacy is described by Tschannen-Moran and Woolfolk
Hoy (2001) as one’s belief that he or she is capable of bringing about desired student
outcomes, even when students are challenging or unmotivated. Research has
demonstrated that teachers’ sense of efficacy is positively related to a wide range of
student outcomes, as well as teacher beliefs and behaviors (Armor, 1976; Berman &
McLaughlin, 1977; Dembo & Gibson, 1985; Ghaith & Yaghi, 1997; Gibson &
Dembo, 1984; Tschannen-Moran & Woolfolk Hoy, 2001; Tschannen-Moran et al.,
1998).
This idea first emerged in 1976 with the publication of a Rand
Corporation evaluation study that sought to identify school and classroom factors
related to increasing student reading scores (Armor, 1976). Among the many factors
measured in the Rand study, the simple two-item measure used to evaluate efficacy
was one of the most powerful factors uncovered; teachers' beliefs in their own ability to influence student outcomes were significantly related to improved reading scores
(Armor, 1976). A second study found that teacher efficacy was significantly related
not only to student achievement but also to the continued use of federally-funded
innovations even after funding had ended (Berman & McLaughlin, 1977).
Later research has shown that teachers with higher teaching efficacy are more likely to embrace and implement innovative classroom techniques and to demand higher end-of-the-year goals for their students (Allinder, 1995); they are also less controlling of their students and welcome being active members of the school community (Woolfolk & Hoy, 1990). Teachers with higher teaching efficacy also demonstrate greater persistence in the face of challenges and are more resilient when dealing with setbacks (Ross & Bruce, 2007; Tschannen-Moran & Woolfolk Hoy, 2001).
Furthermore, teachers with high self-efficacy are more likely to welcome
consultation when dealing with students, feel in control of situations, and have lower
referral rates when teaching difficult students (DeForest & Hughes, 1992; Meijer &
Foster, 1988; Podell & Soodak, 1993). Finally, self-efficacy is an important
determinant of job satisfaction and is negatively correlated with teacher burnout
(Caprara, Barbaranelli, Steca, & Malone, 2006; Skaalvik & Skaalvik, 2007).
Particularly important to the study of professional development is the
research that examines the relationship between self efficacy and innovation and
change. Tschannen-Moran, Woolfolk Hoy, and Hoy (1998) note that change, even
when for the better, can be stressful and difficult. In these cases the development of
teaching efficacy with regard to new innovations appears to be curvilinear (Allinder,
1995; Guskey, 1988). The initial challenges of implementation have a negative
effect on efficacy; however, as teachers develop strategies to cope with change and
see evidence of increased student learning, teaching efficacy increases. This
curvilinear development is addressed in Guskey’s (2000) Model of Teacher Change
which posits that changes in teacher attitudes and beliefs occur after implementing
newly learned practices and observing the changes to student learning that result.
Both the model of teacher change and the studies by Guskey (1988) and
Allinder (1995) have two significant implications for the study of professional
development. First, encouragement and support for the implementation of new
practices is particularly important in the early stages of implementation to counteract
the initial slump in self-efficacy. Providing ongoing support can increase the
likelihood that teachers will experience increased student outcomes and the higher
self-efficacy beliefs that can result. Second, measures of teacher efficacy for new
learning may be more accurate after teachers have an opportunity to implement
classroom changes and observe student outcomes.
Teacher efficacy was considered both in the evaluation of the CB Process
training design and in the evaluation of the training's effectiveness.
Teacher knowledge. One of the inherent goals of professional development
activities is to increase teachers’ understanding of the ideas or principles that are
being taught. Unfortunately, as noted by Kirkpatrick (D. L. Kirkpatrick &
Kirkpatrick, 2007), most evaluations of professional development fail to go beyond
measuring participants’ reactions to the training. Penuel et al. (2007) further note
that when teacher knowledge is assessed, as was the case in the Eisenhower Study
(Desimone, Porter, Garet, Yoon, & Birman, 2002; Garet et al., 2001), self-report data
are often the only means of assessment. Penuel et al. (2007) go on to argue that while
there is often good agreement between teacher self-report and observations, teachers
may be biased towards endorsing desirable practices or understandings whether or
not they actually utilize those practices in the classroom. It is important to include
objective assessments of teacher knowledge when evaluating this particular outcome.
This study represents a step forward in how the organization measures the
outcomes of its professional development programs. Pre- and post- test surveys
asked teachers to answer multiple choice questions that specifically measure their
knowledge and understanding of the CB Process. The data from this study provide
insight into the amount of information learned through the training itself.
Institutional and societal growth. As noted earlier, andragogy refers to the
process rather than the outcome of learning. However, Knowles acknowledged that
the process of learning occurs for a number of reasons; individual learning is
embedded within complex institutional and social contexts and may occur for the
purposes of advancing institutional or societal growth.
Knowles, Holton, and Swanson (2005) note that in addition to improving
individual performance, adult learning also has the potential to improve
organizations. In theory, the ultimate goal of professional development is to further
the goals of the institution. In the case of teacher professional development, the
institutional goals are to improve classroom practice and ultimately improve student
performance. As noted earlier, a great deal of evaluation of professional
development focuses on participant reactions. While evidence does exist that
effective professional development can positively affect both transfer of learning to
classroom practice and student performance, this is clearly an area for future study.
Furthermore, studies that address the subject matter differences in professional
development indicate that in order to facilitate transfer, teachers will need in-depth
and ongoing professional development opportunities.
Learning may also occur with the end goal of improving society. This is best
illustrated by the work of Paulo Freire (1970). According to Freire, the purpose of education is to enable students to become aware of the oppressive forces in their lives and eventually to become active participants in social change (Merriam, Caffarella, & Baumgartner, 2007). Freire viewed adult education as a process of consciousness-
raising with the goal of societal transformation.
As mentioned earlier, the institutional and societal growth goals were used to
evaluate the design of the CB Process training. It was not within the scope of the
current study to collect outcome data pertaining to these goals. Therefore, they were
not considered when evaluating the effectiveness of the CB Process training.
Andragogical Learner Analysis
Knowles et al. (2005) suggest that the Andragogy in Practice framework
should be used in advance of the training to conduct an andragogical learner
analysis. They provide a worksheet created for this purpose (see Appendix A) that determines the "extent to which andragogical principles fit a particular situation" (loc. 1737-1743). The six core andragogical assumptions comprise the rows of the analysis matrix. The columns of the analysis matrix consist of the six factors
contained in the outer two rings of the Andragogy in Practice model. Each cell of
the matrix represents the effects of one factor on a core assumption. The current
study used andragogical learner analysis as part of the qualitative analysis of the
design of the CB Process training.
Characteristics of Effective Professional Development
The previous section discussed the relation between andragogy and
professional development. This section takes a step back and examines specific
design elements or features that are characteristic of effective professional
development. As noted in Chapter One, there has been relatively little rigorous
empirical research from which we can draw conclusions about the characteristics of
effective professional development (Garet et al., 2001; Guskey, 2009; Yoon et al.,
2007). Despite this lack of empirical research, increased scrutiny of the
effectiveness of teacher professional development has led to the publication of a
number of “lists” that identify characteristics of effective professional development
(Guskey, 2003). Using content analysis, Guskey (2003) analyzed the characteristics
identified in 13 well-known and widely-used lists and the evidence from which each
list was derived. His analysis indicated that some lists were derived from empirical
data, others were created as policy syntheses and based on a mix of empirical data
and anecdotal case studies, and other were based on broad reviews of the literature.
Furthermore, there does not appear to be any consensus regarding the criteria for
effectiveness, an argument explored in Chapter One. As a result of this scattershot
approach, while there is some overlap between the lists, the characteristics vary
widely in their frequency of inclusion in the lists and there was no characteristic that
appeared consistently in all of the lists. Guskey (2003) concludes that given the
complexity inherent in the delivery and evaluation of professional development, a
single list of effective characteristics of professional development may never emerge.
However, Guskey (2003) goes on to argue that educators can make progress
toward identifying characteristics of effective professional development. To do so,
educators must first reach consensus that student learning is the ultimate criterion for
effectiveness. Second, the complexities of context and environment must be
considered as well. As mentioned in the previous section, Knowles placed great
responsibility on the facilitator to assess the needs of the adult learner and to then
tailor the learning experience to meet those needs. This fits with Guskey’s emphasis
on the importance of context. There may be a number of features that describe
effective professional development; the relative impact of each of these features may
be dependent on the particular context and characteristics of the participants.
In order to provide teachers with quality professional development programs,
it is important that program designers and providers have an understanding of the
features of effective programs and how these features may be impacted by context.
This section will focus on two studies, in particular, that provide empirical support
for one of the lists analyzed by Guskey (2003).
Six Features of Effective Professional Development
One of the lists analyzed by Guskey (2003) was constructed as part of a
national evaluation of the Eisenhower Professional Development Program. Garet,
Porter, Desimone, Birman, and Yoon (2001) draw from the research that has been
conducted and the contributions of expert practitioners for guidance about the
characteristics of effective professional development. From their review of the
literature, the authors identified three core features of teacher professional
development: (1) focus on content knowledge; (2) opportunities for active learning;
and (3) coherence with other learning activities. The authors further identified three
structural features of effective professional development: (1) the form of the activity
(e.g., workshop vs. professional learning community); (2) collective participation of
teachers from the same school, grade, or subject; and (3) the duration of the activity.
Using data collected as part of the national evaluation of the Eisenhower
Professional Development Program, the authors analyzed the responses of 1,027
math and science teachers to evaluate the relationship of these six characteristics to
teacher learning.
Analysis indicates that professional development that is characterized by the
three core features of professional development (i.e., focus on content knowledge,
active learning, and coherence) is more likely to result in increased knowledge and
skills. The three structural features of professional development (i.e., form,
collective participation, and duration) operate indirectly through the core features.
For example, reform-type activities result in better outcomes mainly because they tend to be of longer duration than traditional activities.
The six characteristics of professional development identified by Garet et al.
(2001) are further explored by Penuel, Fishman, Yamaguchi, and Gallagher (2007)
in their evaluation of the GLOBE earth science education program. Penuel et al.
(2007) note that the Eisenhower Study marked an important advance in the study of
professional development; it utilized data from a national probability sample of
teachers, providing empirical support for the six characteristics. These
characteristics had previously been identified in the literature, but had, until the
Eisenhower Study, limited empirical support. Penuel et al. (2007) go on to argue
that the Eisenhower Study, while groundbreaking, was limited by its breadth. The
breadth of the Eisenhower Study also limited researchers’ ability to examine the
relationship between the characteristics of professional development and objective
measures of teacher outcomes. Moreover, the size of the sample limited the authors’
ability to draw conclusions about what makes professional development effective
within particular contexts. This second argument mirrors Guskey’s (2000, 2003)
assertion, discussed Chapters One and Two, that context is an important factor when
studying the effectiveness of professional development.
To extend the work of Garet et al. (2001), Penuel and colleagues (2007) used
Hierarchical Linear Modeling (HLM) to explore the relationships between features
of professional development and objective measures of teacher outcomes. The
authors queried 454 teachers who received GLOBE training at 28 training sites over
a two-year period. The authors refined Garet et al.’s six characteristics to fit the
particular context of the study. Interestingly, correlations among the refined
variables and the original Eisenhower variables (Garet et al., 2001) with teacher
outcomes were consistent with the earlier results. In particular the authors
determined that the form of the activity, coherence, and collective participation
predicted changes in teacher knowledge and practice.
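To make the structure of such an analysis concrete, the sketch below shows the general form a two-level hierarchical linear model can take when teachers (level one) are nested within training sites (level two). The outcome and predictor names here are illustrative assumptions for exposition only and do not reproduce Penuel et al.'s (2007) actual model specification.

% Illustrative two-level HLM; variable names are hypothetical, not Penuel et al.'s specification
% Level 1 (teacher i in training site j):
\[ Y_{ij} = \beta_{0j} + \beta_{1j}\,\mathrm{Coherence}_{ij} + r_{ij} \]
% Level 2 (training site j): intercepts and slopes vary across sites
\[ \beta_{0j} = \gamma_{00} + u_{0j}, \qquad \beta_{1j} = \gamma_{10} + u_{1j} \]

Here Y_{ij} might represent an outcome such as reported change in teacher knowledge, r_{ij} is teacher-level error, and u_{0j} and u_{1j} capture site-level variation. The substantive point of the nested specification is that the estimated relationship between a professional development feature and the teacher outcome is allowed to differ across training sites.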
The authors conclude that the pattern of findings supports the usage of the
characteristics identified in the Eisenhower Study (Garet et al., 2001) as a useful
framework for examining what makes professional development effective.
However, the findings of this study also indicate that multiple studies are necessary
to determine what constitutes effective professional development. Studies of
different programs in different contexts will most likely yield distinct but
overlapping findings about the characteristics of effective professional development.
The unique contexts and features of each individual professional development
program may make some characteristics more or less important for effective
implementation. Like Guskey (2000, 2003), Penuel et al. (2007) note that when
determining how to implement professional development at the local level, there
should be a good fit between the program and the local context. Providers of
professional development should consider both the “teachers’ own contexts… [and]
the program’s demands on teachers and how those demands can be met within their
contexts” (p. 952).
Congruent with the theory of andragogy, the research on effective features of
professional development emphasizes the need for flexibility dependent on the
unique context of each learning opportunity. This study examined the extent to
which these features of effective professional development were incorporated into
the CB process training and examined the resultant effectiveness of the program.
Results of the study provide information for KS CBECE about what effective
professional development may look like in the KS context.
Evaluating Professional Development: The Kirkpatrick Framework
As discussed in Chapter One, Kirkpatrick’s (2001, 2006; D. L. Kirkpatrick &
Kirkpatrick, 2007) four-level evaluation model was originally created to evaluate the
effect of training programs in business and industry, but has since been applied in the
field of education to evaluate professional development (Guskey, 2000). The four
levels of evaluation outlined by Kirkpatrick (2001, 2006; D. L. Kirkpatrick &
Kirkpatrick, 2007) are: (1) Reactions, (2) Learning, (3) Transfer, and (4) Results.
These four levels of evaluation are meant to provide a comprehensive evaluation of a
training's effectiveness that goes beyond merely measuring participant responses.
The outcomes from each of the four levels can be used to support, extend, or explain
outcomes at other levels. This section provides a discussion of each of the four
levels.
Level One: Reactions
Level one evaluation examines how employees feel about a training program
and directly addresses motivation. Kirkpatrick (2006) explains the necessity of
evaluating participants' reactions by noting that when participant reactions are
positive, the chances that learning will occur are often improved; learners often
increase their efforts at learning when the task is interesting. Motivational theory
supports this global assertion that students learn better when the material to be
learned is interesting to them (Mayer, 2008). Most evaluations of professional
development, to date, focus on this level of evaluation.
However, learning theorists caution against relying solely upon participants'
reactions when measuring the effectiveness of professional development
opportunities. Clark and Estes (2008) note that professional development activities
which are enjoyable for employees may not be the types of activities that will result
in improved job performance. In fact, Kirshner (2006) emphasizes that less
experienced learners benefit more from task-specific, structured learning activities,
even though these may be less enjoyable.
Level Two: Learning
Therefore, it is important to utilize Level two evaluations, which assess the extent to which the training program causes learners to (1) acquire knowledge, (2) learn new skills and/or increase their present skill level, or (3) change their
attitudes. Kirkpatrick (D. L. Kirkpatrick & Kirkpatrick, 2007) asserts that changes in
behaviors and practice cannot be expected unless these learning objectives are
accomplished. Furthermore, if training programs fail to create changes in behavior,
assessing learning can help to diagnose the problem. For example, if assessment
indicates that learning did occur as a result of the training, then other factors may be
responsible for the unchanged behaviors. Kirkpatrick (D. L. Kirkpatrick &
Kirkpatrick, 2007) encourages the use of pre- and post- testing to determine how
much participants learned from the professional development activity and to ensure
that the program is meeting its instructional goals.
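To illustrate what such a pre- and post-test comparison can look like in practice, the following minimal Python sketch computes a paired comparison of knowledge scores. The scores shown are hypothetical placeholders, and this sketch is not the analysis procedure used in the present study; it only illustrates the pre-/post-testing logic Kirkpatrick describes.

# Hypothetical pre- and post-test knowledge scores for the same participants.
# A paired comparison asks whether scores changed after the training.
from scipy import stats

pre_scores = [4, 5, 3, 6, 5, 4, 7, 5]    # hypothetical pre-test scores
post_scores = [6, 7, 5, 8, 6, 6, 9, 7]   # hypothetical post-test scores

# Mean gain from pre-test to post-test
mean_gain = sum(post - pre for post, pre in zip(post_scores, pre_scores)) / len(pre_scores)

# Paired-samples t-test on the same participants' pre and post scores
t_statistic, p_value = stats.ttest_rel(post_scores, pre_scores)

print(f"Mean gain: {mean_gain:.2f}, t = {t_statistic:.2f}, p = {p_value:.3f}")

A design of this kind can indicate whether knowledge changed from before to after the training, which is precisely the diagnostic use Kirkpatrick recommends when behavior change is not observed.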
Level Three: Transfer
Level three evaluations measure transfer: did teachers' classroom practice change in response to the training? Kirkpatrick (D. L. Kirkpatrick & Kirkpatrick,
2007) notes that this level of evaluation may be the truest measure of a training
program’s effectiveness. Level three evaluations are not often utilized because
measuring transfer is a complex task that can be quite time-consuming. Transfer can
be difficult to measure; evaluators must consider when and where changes in practice are likely to occur, as well as how these changes can be observed, measured, and captured.
Level Four: Results
Level four evaluations examine results; did participation in the professional
development program produce results in the form of student learning or successful
program implementation? This level of evaluation is what policy makers look for in
determining whether a program is delivering on its investments. Kirkpatrick (D. L.
Kirkpatrick & Kirkpatrick, 2007) points out that this final level is rarely measured
although it is arguably the most important. The goal of all professional development
is to improve student outcomes, yet there is relatively little research that examines
the relationship between participating in professional development and achieving
better student outcomes.
Applying the Kirkpatrick Framework
The current study focused on levels one and two of the Kirkpatrick (2006)
framework. Participants’ reactions to the training were collected using post-test
surveys and used to support evaluations of the CB Process training design. Learning
was measured using pre- and post-test evaluations of both teacher efficacy and
knowledge. The study also evaluated the relationship between participants’ reactions
(level one) and change in knowledge (level two). The research questions of the
study also addressed level three (transfer) but conclusions were limited by the scope
of the study. The use of Follow-up surveys permitted collection of self-report data
regarding teachers’ perceptions of their success at implementing the CB Process.
The scope and timing of the study limited the types of data that could be collected or
that were available for in-depth analysis of levels three and four. These levels of
analysis and suggestions for future study are discussed in Chapter Five.
Summary
In order to achieve the goals set by educational reforms, teachers need
effective professional development. However, there is limited high-quality research that
educators can consult in order to create or choose effective professional development
programs. There have been a number of areas identified in the literature that must be
addressed in order to obtain a clearer and more nuanced understanding of what
constitutes effective professional development.
First, both the study of andragogy and the research on characteristics of
effective professional development provide a number of insights into what an
effective professional development program might look like. The six core
assumptions of adult learning should be considered when designing any professional development activity: (1) the learner's need to know, (2) self-concept of the learner, (3) prior experience of the learner, (4) readiness to learn, (5) orientation to learning, and (6) motivation to learn. While there are many competing lists of effective
features of professional development, the results of the Eisenhower Study provide us
with empirical support for six core features that can be studied across multiple
contexts. These core features, along with the six assumptions of adult learners can
form a foundation for planning effective professional development activities.
Second, the Andragogy in Practice model can be used to provide a useful
framework for applying these assumptions and features across different contexts.
Consistent with the literature, the Andragogy in Practice model encourages
educators to consider and evaluate the effects of contextual variables on professional
development efforts. Variables that are of particular interest include teacher
experience and education level, subject-matter differences, and outcome variables
such as self-efficacy, teacher knowledge, and transfer of learning to practice.
Finally, Kirkpatrick’s (2001, 2006; D. L. Kirkpatrick & Kirkpatrick, 2007) four
levels of evaluation offer both specific criteria to determine the effectiveness of
professional development and a practical framework for measuring the desired
outcome variables.
The use of the Andragogy in Practice model (Knowles et al., 2005) and
Kirkpatrick’s (2001, 2006; D. L. Kirkpatrick & Kirkpatrick, 2007) framework to
study and think about professional development fits into the research agenda put
forth by Borko (2004) and discussed in Chapter One. Kirkpatrick’s framework
provides a common method of measuring effectiveness while the Andragogy in
Practice model provides a situative perspective that considers the four key elements
of the professional development system: (1) the professional development program,
(2) the teachers (i.e., participants), (3) the facilitator, and (4) the context in which the
professional development occurs. Multiple studies that examine how these pieces fit
together in different contexts and with different professional development programs
have the potential to provide nuanced data that can move educational research
towards the goal of providing effective, high-quality professional development for all
teachers. The focus of the current study is on level one of Borko's research agenda
and on levels one, two, and three of the Kirkpatrick framework. The outcomes of the
study will be primarily used to enable KS to assess and improve the quality and
effectiveness of its professional development programs.
CHAPTER THREE
METHODOLOGY
Purpose of the Study
The purpose of this study was to evaluate the quality and effectiveness of a
professional development activity called the Challenging Behavior Process (CB
Process) training. The quality of the training was assessed using criteria from the
research literature on andragogy and effective professional development. The
effectiveness of the training was then evaluated using the Kirkpatrick (2001, 2006;
D. L. Kirkpatrick & Kirkpatrick, 2007) framework to determine the relationship
between participation in the training and the following outcomes: teachers’ self-
efficacy, teachers’ knowledge about the CB process, and teachers’ self-reported
ability to implement the CB Process. The study utilized a mixed methods approach
to examine these relationships. Both qualitative and quantitative methods were used
to assess the design and quality of the training itself. Quantitative methods were
used to evaluate levels one, two, and three of the Kirkpatrick framework: reactions,
teacher learning, and transfer.
Research Questions
The research questions guiding this investigation are:
1. Was the training designed to effectively meet the needs of teachers
according to andragogical theory and the criteria for effective
professional development?
a. Kirkpatrick Level 1 (Reaction): What were participants’ reactions
to the CB Process training?
2. Kirkpatrick Level 2 (Learning): What is the relationship between
participation in the CB Process training and participants’ perceived self-
efficacy?
3. Kirkpatrick Level 2 (Learning): What is the relationship between
participation in the CB Process training and participants’ understanding
of the CB Process?
a. Is there a relationship between participants’ evaluation of the CB
Process training and participants’ understanding of the CB
Process?
4. Kirkpatrick Level 3 (Transfer): Do participants report being able to
successfully implement the objectives taught in the CB Process training?
Methodology
This study used a mixed methods research design, which is defined by
Tashakkori and Teddlie (2003) as the use of both qualitative and quantitative data
collection and analysis techniques in either sequential or concurrent phases of the
study. The rationale for mixing methods of data collection and analysis is that
neither quantitative nor qualitative methods by themselves are able to fully capture
the complexity of a situation such as the design, implementation, and evaluation of a
professional development program (Creswell, 2003). By using both qualitative and
quantitative methods the scope and breadth of the study are expanded. To better
understand the benefits of the mixed method approach, this section will discuss the
qualitative, quantitative, and mixed methods approaches as well as the knowledge
paradigms that underlie each approach. Specific mixed method design issues will
also be discussed.
Quantitative research is based on a postpositivist paradigm in which causes
determine outcomes or effects (Creswell, 2003). The problems studied by
postpositivist researchers reflect this philosophy by focusing on using controlled
experiments to determine cause and effect relationships. Postpositivists believe that
the world is governed by laws or theories that can be tested and verified through use
of the scientific method. Knowledge is based on observation and measurement of
objective reality. Quantitative research concerns itself with isolating variables in
order to discover causal relationships. Validity and reliability in quantitative
research depend on random assignment, the careful construction of research
instruments, and standardization of procedures to determine cause and effect
relationships that are generalizable across different contexts and to ensure that the
study can be replicated.
In contrast, qualitative research represents an "inquiry process of understanding" where the researcher develops a "complex, holistic picture, analyzes
words, reports detailed views of informants, and conducts the study in a natural
setting” (Creswell, 1998, p. 15). Qualitative research is based on the constructivist
(Lincoln & Guba, 2000) or advocacy/participatory (Kemmis & Wilkinson, 1998;
Mertens, 2003) paradigms. Because researchers in these paradigms believe that
knowledge is socially constructed through an individual’s interactions and
engagement with the people and the world, qualitative research emphasizes the
importance of the researcher immersing himself in the everyday life of the research
setting (Creswell, 2003). Qualitative research is inductive; meaning is generated from data collected in the field and from analysis grounded in the perceptions and values of the participants (Patton, 2002).
Researchers using a mixed method approach operate from a pragmatic
paradigm. Pragmatists are concerned with solutions to problems and “what works”
(Creswell, 2003). Therefore research methods, variables, and units of analysis are
chosen which are most appropriate for finding an answer to the research question
(Tashakkori & Teddlie, 1998). Pragmatists believe that qualitative and quantitative
methods are compatible and that the use of both, either sequentially or concurrently,
will provide the best understanding of the phenomena of study (Creswell, 2003).
Tashakkori and Teddlie (2003) note that, unlike mixed model research
designs, mixed methods designs are only marginally mixed. Mixed model research
is mixed in most or all stages of the study from research questions to data analysis
and conclusions. However, in mixed methods designs qualitative and quantitative
methods are used only in the data collection and analysis; frequently the type of
questions asked and the inferences that are made at the end of the study are either
qualitative or quantitative (Tashakkori & Teddlie, 2003, p. 11). This study utilized a
mixed method approach in which the data collection and analysis employed both
qualitative and quantitative methods, but the research questions and implications
were primarily quantitative in nature.
Greene, Caracelli, and Graham (1989) identify five purposes for conducting
mixed method studies, each with its own unique design considerations. In mixed
method studies with expansion purposes, the goal of using mixed methods is to
“multi-task”; the scope and breadth of a study are expanded by utilizing multiple
methods of data collection. The authors recommend two elements essential for a
mixed method design with expansion purposes: the empirical work should be encompassed within a single study, and the phenomena should be distinct.
“Phenomena” refers to the degree to which qualitative and quantitative methods are
used to assess totally distinct phenomena or the exact same phenomena. Evaluation
studies, similar to the current study, often use methodology associated with an
expansion intent; qualitative methods are used to explore process and quantitative
methods are used to measure program outcomes (Greene, Caracelli, & Graham, 1989;
Tashakkori & Teddlie, 1998). The authors note that expansion designs have the
potential to provide researchers with the data to make strong inferences and
conclusions. However, at this time many expansion designs have had difficulty
integrating qualitative and quantitative data; appearing more parallel rather than
expanded.
Finally, when designing a mixed methods study, Creswell, Plano Clark,
Gutmann, and Hanson (2003) note three issues that should be considered: priority,
implementation, and integration. Priority refers to which method, qualitative or
69
quantitative, is given primary emphasis in the study. Implementation refers to
whether qualitative and quantitative data collection and analysis are conducted
sequentially or concurrently. Integration refers to the phase of the research process
in which the integration of qualitative and quantitative data occurs. This study
utilized a concurrent nested design in which qualitative and quantitative data were
collected simultaneously. Creswell et al. (2003) note that in this type of study
different methods may be used to study different groups or levels within the design,
a method which Tashakkori and Teddlie (1998) call a multilevel design. Qualitative
and quantitative data were integrated during the analysis phase of the research.
In this study quantitative surveys that assessed teacher efficacy, teacher
learning, and transfer were administered to the teaching staff. The director of the
CBECE Instructional Branch, who is responsible for teacher training efforts, was
interviewed to explore the purposes and goals for the activity (See Appendix B).
Qualitative document analysis was used to review the training materials (See
Appendix C). Qualitative methodology was also used to analyze participants'
open-ended responses to survey questions. These qualitative analyses provided
information about the content and format of the training being studied
and informed the conclusions drawn from the quantitative analyses. The rationale for
this approach was to expand the scope of the study by using multiple methods
(Greene et al., 1989) to better understand the relationship between process and
outcomes of the professional development activity.
70
Setting and Participants
Kamehameha Schools is a private organization, located in the State of
Hawaii, whose mission is to provide, “educational opportunities in perpetuity to
improve the capability and well-being of people of Hawaiian ancestry” (“About
Kamehameha Schools”). Bernice Pauahi Bishop, a member of Hawaiian royalty,
bequeathed her estate to be used to establish the school. In accordance with the will,
it is the policy of Kamehameha Schools to give admissions preference to students of
Hawaiian ancestry to the extent permitted by law (“About Kamehameha Schools”).
As part of fulfilling its mission to educate persons of Hawaiian ancestry, the
Kamehameha Schools subsidizes a significant portion of the tuition at each of its
schools and provides scholarships for students attending other educational
institutions from preschool to graduate school. Additional aid is also available to
families who demonstrate financial need.
The Community Based Early Childhood Education (CBECE) Division
provides preschool education for three- and four-year-old students in 83 regular and
extended-day classrooms at 30 preschool sites on five islands (Kamehameha
Schools, 2008). Division-wide support is provided through Kamehameha Schools’
Instructional Support Branch (IS) and Planning and Accountability (P&A) Branch.
The division is subdivided into 11 regions, each supervised by an Educational
Coordinator (EC). Some regions also employ an Assistant Educational Coordinator
(AEC). Support staff in each region includes an Outreach Counselor (OC) and an
Instructional Specialist (IS).
71
A May 2009 Staff Survey (Kamehameha Schools, 2009a) indicates that
Kamehameha Schools employs 70 regular day classroom teachers and 70 regular day
teaching assistants (TAs). Sixteen percent of teachers and 17% of TAs have worked in early
childhood education (ECE) for 21 years or more. The average years spent teaching is
16.06 years and the average number of years teachers have held their current job is
12.35 years. All lead teachers hold, at minimum, a bachelor’s degree; 43% hold a
bachelor’s degree and 57% have attained a Master’s degree or higher. All TAs are
required to obtain a Child Development Associate (CDA) credential; 33% hold an
associate’s degree, 19% hold a bachelor’s degree, and 3% hold a graduate degree.
The teachers at Kamehameha Schools possess significantly higher levels of
education than found in the general ECE teacher population (Kamehameha Schools,
2009a; Good Beginnings Alliance, 2007).
Positive Behavior Support
In July of 2008, the Community Based Early Childhood Education (CBECE)
Division launched its PBS Initiative. PBS is a comprehensive, two-pronged,
cognitive behavioral system designed to support the social emotional development of
young children by (1) building positive relationships and designing supportive
environments; and (2) utilizing social emotional teaching strategies and
individualized interventions to reduce the occurrences of challenging behavior (Fox,
Dunlap, Hemmeter, Joseph, & Strain, 2003). The research literature demonstrates
that social emotional development, particularly the development of prosocial skills,
is a vital building block for future school success (Bowman et al., 2001; Copple &
72
Bredekamp, 2009; Raver, 2002; Raver & Knitzer, 2002; Stormont, Lewis, Beckner,
& Johnson, 2008; Zins, Bloodworth, Weissberg, & Walberg, 2004). Furthermore,
there is an increased risk of school failure and other negative outcomes for young
children who demonstrate challenging behaviors (Technical Assistance Center on
Social Emotional Intervention for Young Children (TACSEI), 2004).
Unfortunately, one in ten children under the age of six will experience
emotional and/or behavioral disorders above and beyond what is developmentally
expected (Kamehameha Schools, 2009b; Hogan, 2003). Meta-analyses reveal that,
depending on the population being studied, 3 – 33% of preschool-aged children
display problem behaviors (Qi & Kaiser, 2003). At Kamehameha Schools, 11% of
the preschool children in the program in SY 2007 – 2008 required additional
supports due to challenging/problem behaviors (Kamehameha Schools, 2009b).
There is a range of behaviors that teachers may encounter in the preschool
classroom. Examples of challenging behaviors include those that are a health or
safety concern, such as self-injury, intentional destruction of school property or
materials, or intentional physical aggression. Examples of challenging behaviors
that are minor and ongoing include tantrums, non-compliance, or disruptive
behaviors that are not responsive to developmentally appropriate classroom
management approaches.
Based on these data, the CBECE division identified social emotional
development as an important program component to ensure that all KS students
receive a high-quality education and are prepared to enter Kindergarten ready to
73
succeed (Kamehameha Schools, 2009c; Bowman et al., 2001; Copple & Bredekamp,
2009; Hawaii Good Beginnings, Interdepartmental Council, & School Readiness
Task Force, 2006). PBS was chosen by CBECE leadership as a way to address this
issue. PBS has been implemented in a variety of schools and settings across the
nation. It is not a ready-made curriculum or package; PBS is a "comprehensive
process that uses multiple research-supported practices,” to build effective
preventative and intervention systems that address the unique needs of the particular
school and students (Stormont et al., 2008, p. 25).
To facilitate the implementation of the PBS Initiative at KS, the division
created a Leadership Team (PBS – LT). The PBS-LT is a decision making body
made up of representatives from each region and job category within the division.
Within each region, the EC, AEC (where the position exists), OC, and IS constitute
the PBS Regional Implementation Team (RIT) which meets each week and is
responsible for overseeing PBS implementation at the regional level. The
implementation and design of PBS at Kamehameha Schools shares some common
features with other schools’ PBS programs. However, the process was tailored to
meet the unique needs of the students, families, and staff of KS CBECE.
There are a variety of professional development opportunities provided by
KS; some are mandatory, division-wide activities; others are provided at the regional
level based on the needs of the region or individual staff member. While many of
these regional level professional development activities are designed by the
74
Instructional Branch, they are not mandated trainings. ECs have the authority and
discretion to conduct these on an as needed basis.
Division-wide, mandated professional development efforts in the first year of
implementation provided staff with training on building positive relationships,
creating supportive environments, and the use of social emotional teaching strategies
(Kamehameha Schools, 2009b). Professional development efforts in the second and
third years of implementation focused on the development and implementation of
portions of the CB Process; these efforts focused on refining the Behavioral Incident
Report (BIR) form and training teachers in its usage.
The Challenging Behavior Process
The Challenging Behavior Process is an integral part of the PBS system and
is an important tool for teaching students appropriate prosocial skills and fostering
emotional growth and development. This is the formal process for identifying and
addressing challenging behaviors that arise in the classroom. The process begins
with identifying and documenting challenging behaviors as they occur in the
classroom. The process then lays out a series of steps and actions that can be taken
to address those behaviors with the goal of teaching students positive strategies that
will result in social emotional growth and the amelioration of challenging behaviors.
This process is one aspect of PBS that has been tailored to meet the unique needs of
KS CBECE. CBECE’s CB Process features two distinct pathways for addressing
challenging behaviors in young children. The first pathway, the Intervention Plan,
addresses behaviors that are minor and ongoing. The second pathway, the Individual
75
Support Plan, is more intensive and addresses challenging behaviors that have not
responded to Intervention Plans or challenging behaviors that are extreme and pose
serious safety concerns. This multiple pathway approach differs from most PBS
systems which utilize a single pathway to address all challenging behaviors.
In addition to addressing student outcomes, the CB Process also impacts
teacher outcomes. A 2009 staff survey indicated that 89% of CBECE respondents
identified addressing challenging behaviors as a training need. A significant number
of respondents also indicated that the occurrence of challenging behaviors negatively
affected their overall job satisfaction; 20% indicated that they were affected a great
deal and 53% reported being somewhat affected.
Finally, the CB Process also helps the division to fulfill its responsibilities to
students and families in compliance with recent amendments to the Americans with
Disabilities Act (ADA) of 1990 (2009). Under the ADA, preschools are required to
make reasonable modifications to policies and practices in order to integrate children
and families with disabilities into the program. Programs must do so unless these
changes represent fundamental alterations to the program. The CB Process is one
method that CBECE uses to determine and implement reasonable program
modifications for children whose challenging behaviors stem from a diagnosed
disability.
The first three years of implementation were also used to create and refine the
Challenging Behavior Process. During the first three years of implementation staff
did not receive formal training about this process; the process itself was being
76
created and refined as it was being used. Once the Challenging Behavior process
was finalized, a formal professional development training was created. The stated
goals of the CB Process training are to familiarize staff with the following aspects of
the Challenging Behavior Process: (1) documentation, (2) support planning, (3) staff
roles, (4) purpose of meetings, and (5) time frames (See Appendix C).
The CB Process training was chosen as the focus for this study for a number
of reasons. First, implementation of PBS has been a division focus and priority for
the past three years. The CB Process is an essential feature of this system and also
has impacts on teacher job satisfaction and ADA compliance. Therefore it is
important that teachers receive effective and comprehensive training on
implementing this process. Second, unlike other professional development activities,
the CB Process training was planned far in advance of the school year, materials and
informants are readily accessible, and the training is provided across the division.
These factors ensured an adequate sample size and the opportunity to collect data
within the targeted time frame of the dissertation process.
CB Process training design. As mentioned earlier, the CB Process at
Kamehameha is unique to the program and has added complexity in comparison to
other programs. Information about the design and implementation of the CB Process
training was obtained through a semi-structured interview of the Director of
Instructional Support (personal communication, October 26, 2011).
According to the Director, a division-wide CB Process training was not a part
of the original implementation plan. Other programs that use PBS provide a general
77
training about strategies used in the CB Process (titled Module 3a & 3b) to staff
members and intensive training to the support staff that are responsible for
implementing the CB Process. KS followed this model and initially trained only the
RIT members in the CB Process. However, staff surveys and feedback indicated that
staff members needed and/or wanted more information about the CB Process and
how it related to their classrooms and students. The current CB Process training was
designed in response to staff demands. It is part of a five-year implementation plan
that includes on-going training to address various facets of the PBS initiative.
The CB Process training was created by a PBS Leadership Team workgroup.
This work group was assisted by a member of the Instructional Branch, who is an
experienced trainer and “steeped in training and professional development” (personal
communication, Director of Instructional Support, October 26, 2011). The training
was then vetted and edited by various stakeholders and rolled out by the Educational
Coordinators and RIT in each region. Given the background and composition of the
team that created the training, it was assumed that learning theory and specifically
the needs of the CBECE staff were considered (personal communication, Director of
Instructional Support, October 26, 2011).
The CB Process training itself was a one-time, mandatory, workshop-type
training. Educational Coordinators (ECs) in each region were tasked with
implementing the CB Process training during the first six weeks of the 2011 – 2012
school year. All teaching and support staff were required to undergo this training.
78
The format for the training was a scripted PowerPoint presentation that was 1
– 2 hours in length. Fidelity was assumed due to the scripted nature of the
presentation. An examination of the training PowerPoint (See Appendix C) revealed
that the majority of the training was "teacher-directed"; that is, information was
presented to staff members. However, the training ended with a few collaborative
activities that asked staff to respond to a variety of scenarios based on the
information taught in the training. These scenarios mainly addressed the early stages
of the CB Process; specifically, when and how to fill out a Behavioral Incident
Report (BIR), and the next step after the completion of the BIR.
Staff were provided with resource materials that were utilized during the
training. The materials provided included access to the CB Process manual, a
handout that outlined the immediate Support Guidelines, the CB process flowchart,
blank Behavioral Incident Report forms, and access to the PowerPoint presentation
(either in digital or hard copy format). Again, the exact resources provided to staff
were scripted into the PowerPoint presentation (See Appendix C).
Although the training was mandatory, follow-up or support activities were
not a divisional requirement. Follow-up training and/or support was left to the
discretion of the Educational Coordinators and RIT in each region. RITs may
choose to provide additional training or support as necessary to meet the specific
needs of the region or staff members.
79
Instrumentation
The current study utilized a series of surveys, an interview, and document
analysis to gather data that were used to evaluate the effectiveness of the CB Process
training. This section will provide detailed information on the instruments used in
this study.
Surveys of teaching staff. A brief 15-minute pre-test survey (See Appendix
D) was administered to all teaching staff at the start of the school year prior to the
CB Process training. A brief 15-minute post-test survey (See Appendix E) was
administered to all teaching staff immediately following the CB Process training.
Pre- and Post-Training survey items covered four areas: (1) teacher demographics;
(2) teachers’ response to the training (Post-test only); (3) teacher efficacy; and (4)
teacher knowledge about the CB Process.
Follow-up surveys (See Appendix F) were administered to teachers who
participated in the CB Process during the first semester of the school year (August
2011 – December 2011). Participation was defined as having a student whose
challenging behaviors required the creation of an Intervention or Individual Support
Plan. Follow-up survey questions covered four areas: (1) teacher response to the
training, (2) self-reported success in implementing the CB Process, (3) the types and
effectiveness of support available to teachers, and (4) teacher efficacy. The follow-
up survey provided information about specific types of support and follow-up
available to teachers at the regional level. This information was used to
contextualize study results.
80
The following sections provide detailed information about the questions and
scales used in the three surveys (See Appendices D, E, & F).
Teacher Sense of Efficacy Scale (TSES). Teacher efficacy questions were
taken from the 12-item Teacher Sense of Efficacy Scale – Short Form (TSES;
Tschannen-Moran & Woolfolk Hoy, 2001). The TSES uses a 9-point Likert scale
and measures teacher efficacy in the following areas: (1) Efficacy in Instructional
Strategies; (2) Efficacy in Student Engagement; and (3) Efficacy in Classroom
Management. Unlike previous measures of teacher efficacy, the TSES has a unified
and stable factor structure and also assesses a broad range of teaching tasks and
capabilities. The TSES avoids being overly specific therefore preserving its ability
to be used to compare teachers across a variety of contexts, grade levels, and
subjects.
Responses to each question were determined using the following anchors: 1
= Not at All, 3 = Very Little, 5 = Some degree, 7 = Quite a Bit, and 9 = A Great
Deal. To determine the Efficacy in Classroom Management, Efficacy in Instructional
Strategies, and Efficacy in Student Engagement subscale scores, the unweighted
means of the items that load onto each factor were computed. These groupings are
(Tschannen-Moran & Woolfolk Hoy, 2001):
Efficacy in Student Engagement: Items 2, 4, 7, 11
Efficacy in Instructional Strategies: Items 5, 9, 10, 12
Efficacy in Classroom Management: Items 1, 3, 6, 8
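To illustrate this scoring step, a minimal sketch in Python (using the pandas library) is provided below. The column names (q1 through q12) and sample ratings are hypothetical stand-ins for the actual survey export, not data from this study.

import pandas as pd

# Hypothetical responses: columns q1-q12 hold the 1-9 TSES short-form ratings.
responses = pd.DataFrame({f"q{i}": [7, 8, 6] for i in range(1, 13)})

# Item groupings from Tschannen-Moran and Woolfolk Hoy (2001).
subscales = {
    "engagement": [2, 4, 7, 11],
    "instruction": [5, 9, 10, 12],
    "management": [1, 3, 6, 8],
}

# Each subscale score is the unweighted mean of its items;
# the total TSES score is the mean of all twelve items.
scores = pd.DataFrame({
    name: responses[[f"q{i}" for i in items]].mean(axis=1)
    for name, items in subscales.items()
})
scores["total"] = responses.mean(axis=1)
print(scores)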
81
Tschannen-Moran and Woolfolk Hoy (2001) found that the TSES
demonstrated strong reliability. Analysis of the TSES data from the current study
demonstrated that the scale and its subscales were similarly reliable with this sample.
The original reliability findings and the reliabilities from the current study are
presented in Table 2.
Table 2
TSES Short Form Reliability
Mean SD alpha
Scale Original Current Original Current Original Current
TSES 7.1 7.5 .98 .91 .90 .96
Engagement 7.2 7.5 1.2 .96 .81 .88
Instruction 7.3 7.4 1.2 .92 .86 .85
Management 6.7 7.6 1.2 .96 .86 .89
Note. Partially adapted from “Teacher efficacy: capturing an elusive construct,” by M. Tschannen-
Moran & A. Woolfolk Hoy, 2001, Teaching and Teacher Education, 17(7), p. 800. Copyright 2001
by Elsevier Science Ltd.
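The alpha values reported in Table 2 are Cronbach's alpha coefficients. A minimal sketch of how such a coefficient can be computed for one subscale is shown below, assuming the same hypothetical DataFrame layout as in the previous sketch; the ratings are invented for illustration only.

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical Student Engagement items (2, 4, 7, 11) for five respondents.
engagement = pd.DataFrame({
    "q2":  [7, 8, 6, 9, 7],
    "q4":  [8, 8, 5, 9, 6],
    "q7":  [7, 9, 6, 8, 7],
    "q11": [6, 8, 6, 9, 7],
})
print(round(cronbach_alpha(engagement), 2))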
CB Process knowledge questions. Ten knowledge questions were created to
measure what participants learned during the CB Process training (See Appendices D,
E, & F). These knowledge questions were constructed based on the key points
covered in the scripted PowerPoint presentation. The knowledge questions were
submitted for review and comment to the Director of Instructional Support and the
82
five ECs participating in the study. All questions were multiple choice; seven
questions addressed factual knowledge about different parts of the process and three
questions asked participants to apply their knowledge of the CB process to common
scenarios. The scenarios used in this study were unique and had not been used in
this or any other PBS training. For initial analyses, answers were coded “correct” or
“incorrect”. A composite knowledge score was computed by summing the number
of correct answers. Participant answers to individual questions were analyzed
separately to determine common errors.
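A minimal sketch of this scoring procedure is given below; the answer key, item names (k1 through k10), and responses are hypothetical placeholders rather than the actual survey items.

import pandas as pd

# Hypothetical answer key for the ten multiple-choice knowledge questions.
answer_key = {f"k{i}": key for i, key in enumerate("abcdabcdab", start=1)}

# Hypothetical participant responses to the same ten questions.
responses = pd.DataFrame({
    "k1": ["a", "b"], "k2": ["b", "b"], "k3": ["c", "c"], "k4": ["d", "a"],
    "k5": ["a", "a"], "k6": ["b", "b"], "k7": ["a", "c"], "k8": ["d", "d"],
    "k9": ["a", "a"], "k10": ["b", "c"],
})

# Code each answer correct (1) or incorrect (0), then sum across items
# to form the composite knowledge score (0-10) for each participant.
correct = responses.apply(lambda col: (col == answer_key[col.name]).astype(int))
knowledge_score = correct.sum(axis=1)

# Per-item proportions correct support the item-level analysis of common errors.
print(correct.mean(axis=0))
print(knowledge_score)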
Teacher Activity Survey. Items from Sections Three and Four of the Teacher
Activity Survey (TAS; Garet, Birman, Porter, Desimone, & Herman, 1999) were
adapted for use in this study. Survey questions asked teachers to describe the
professional development activity and to rate its coherence with teacher and
institutional goals, as well as, other professional goals, standards, and assessments.
One item asks teachers to rate their success at implementing the skills taught during
the professional development activity. This item will be used as a self-report
measure of Kirkpatrick’s (2001, 2006; D. L. Kirkpatrick & Kirkpatrick, 2007) third
level, transfer. Teachers were also surveyed to determine if any issues arose that
hindered the implementation of skills learned from the activity.
Interview. The purpose of the interview was to meet with the Director of
Instructional Support to establish the administrative rationale for the CB Process
training, the theory of learning that informed the construction of the activity, the
structure of the activity, and the plan for implementing the training.
83
The researcher used an Interview Protocol tool that included questions
encompassing three key themes: (a) Rationale for CB Process training; (b)
Underlying theory(s) of learning; and (c) Structure and implementation of CB
Process training (See Appendix B). Within these themes, a series of open-ended
sub-questions was included at the time of the interview. Interview responses were
used to supplement and inform the conclusions drawn from the quantitative analyses.
Document analysis. Qualitative document analysis was used to review the
PowerPoint presentation used in the training (See Appendix C). The PowerPoint
was analyzed using andragogical learner analysis (Knowles et al., 2005) and the six
features of effective professional development (Garet et al., 2001) discussed earlier.
This analysis provided information about the content and format of the training being
studied and informed the conclusions drawn from the quantitative analyses.
Data Collection
Data collected for this study consisted of three surveys and one interview.
Survey responses could not be linked back to individuals. All respondents created a
unique identification number, using year of birth and last four digits of their phone
numbers, to allow pre-, post-, and follow-up survey responses to be linked. The
following sections detail the data collection procedures.
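As a minimal sketch of this linking step, the example below uses hypothetical column names (uid for the self-generated identifier) and invented values; an inner merge retains only respondents who completed both surveys.

import pandas as pd

# Hypothetical pre- and post-test exports; "uid" stands in for the self-generated
# code (birth year plus last four digits of the phone number) entered on each survey.
pre = pd.DataFrame({"uid": ["1975-1234", "1982-5678", "1968-9012"],
                    "pre_knowledge": [4, 5, 3]})
post = pd.DataFrame({"uid": ["1975-1234", "1982-5678"],
                     "post_knowledge": [6, 7]})

# An inner merge keeps only respondents who completed both surveys (the matched sample).
matched = pre.merge(post, on="uid", how="inner")
print(matched)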
Pre-test survey administration. Regions 4 and 5 scheduled the mandatory
CB Process training on the first teacher work day of the school year. Because most
teachers do not access their school emails during the summer break, teachers in these
regions were asked to fill out paper and pencil versions of the pre-test survey.
84
Regions 1, 2, and 3 scheduled the CB Process training later in the school year, in
these regions pre-test surveys were distributed to all teaching staff using the Survey
Monkey Online Survey tool on the first teacher work day of the 2011-2012 school
year. A link to the survey was emailed to all teaching staff in the target regions. The
survey was available online until the CB Process training was presented; email
reminders were sent to staff every week until the survey closed. There were 105
staff members from five regions who received the survey request.
Although the first few weeks of the school year are busy, CBECE sets aside
the first day of work specifically for staff meetings, short training efforts, and teacher
preparation. Furthermore, the first few weeks of the year represent the student
orientation period. During this period students attend school for part of the day and
the rest of the school day is devoted to teacher preparation activities. This study
utilized this time period to take advantage of the extra time teachers have to
participate in training activities.
Post-test survey administration. A Post-Test survey link was distributed
via email to teaching staff using the Survey Monkey Online Survey Tool
immediately after the completion of the CB Process training. The survey was
available online for four weeks; email reminders were sent to staff every week until
the survey closed. There were 105 staff members from five regions who received the
survey request.
Follow-up survey administration. Data from SY 2007 – 2008 showed that
11% of students in the CBECE programs required additional supports due to
85
challenging behaviors (Kamehameha Schools, 2009b). The PBS literature indicates
that even with the universal supports provided by PBS, 5% to 7% of students may
require the intensive, individualized supports provided through the CB Process
(Stormont et al., 2008). Based on these data, it was expected that 5% - 11% of
CBECE students may have required the supports provided through the CB Process.
The teachers who guided these students and their families through the process were
the focus of the Follow-up survey. At the beginning of December 2011, Educational
Coordinators provided email contact information for teaching staff who participated
in the CB Process, defined as participating in the creation and implementation of an
Intervention or Individual Support Plan, from August 1, 2011 to November 30,
2011. Follow-up survey links were distributed via email to teaching staff using the
Survey Monkey Online Survey tool. The surveys remained available for three work
weeks; email reminders were sent to staff every week until the survey closed.
Qualitative interview. The interview of the Director of the Instructional
Branch was conducted on October 26, 2011. The interview was recorded and
subsequently transcribed for analysis. The interview findings were used to conduct
the andragogical learner analysis (Knowles et al., 2005) and to assess the degree to
which the CB Process training fit with Garet et al.'s (1999) features of effective
professional development.
Limitations
First, the specificity of the CB Process limits the generalizability of study
findings beyond the context of KS CBECE. PBS is a system that is designed to be
86
flexibly implemented in many different school contexts. The CB Process used at KS
is based on PBS principles, but was designed by the organization to meet the specific
needs of the KS CBECE program.
Second, because the CB process training was a mandatory training for all
staff, a pre/post design was used. The lack of a random assignment or a control
group limits the conclusions that can be drawn; the study can only describe
relationships between the training and the outcomes measured, not causality.
Third, the scope of the study necessitated that only five of the eleven CBECE
regions were studied. This limited the size of the sample and may have reduced the
ability of the researcher to uncover significant differences in pre-, post-, and follow-
up scores.
Fourth, the scope of the study and time constraints did not allow for direct
observation of the implementation of the CB Process or for access to student
outcome data that are collected at the end of the school year. Therefore, the current
study only examined self-reported teacher perceptions about implementation and did
not address student outcomes.
Because of these limitations, the current study primarily addressed levels one
and two of the Kirkpatrick (2001, 2006; D. L. Kirkpatrick & Kirkpatrick, 2007)
framework (reactions and learning). Teacher self-report was used to measure level
three (transfer), providing limited insight into the transfer of learning to practice.
The research literature emphasizes the need for studies that go beyond measuring
only reaction; this study provides valuable information to the program on
87
participants’ learning and transfer. Prior to this study, the program had not collected
data to specifically measure the effectiveness of its professional development
activities. This study represents an important step forward in the organization’s
evaluation efforts and will provide much needed information that can inform future
professional development efforts.
Finally, recognizing the need for research on the effectiveness of professional
development training, the researcher initiated the study with the understanding that
she would guard against personal bias as a result of knowing and
being employed by the developers/facilitators/trainers of the professional
development program. The study was supported by the administration, staff, and
faculty of the preschools in the study with the understanding that the research was
intended to examine changes in teacher efficacy, knowledge, and practice related to
participation in the professional development training.
Data Analysis
The sample used for this study consisted of the teachers, teaching assistants,
and extended day staff from the five CBECE regions on the island of Oahu who
participated in the Challenging Behavior Process training. Although 105 pre-and
post-test surveys were administered, only 24 staff members responded to both the
pre- and post-test surveys. The current study focused on the 24 respondents who
completed both a pre-test and post-test survey; these respondents are identified as the
matched sample. Fourteen teaching staff were identified by their ECs as having
participated in the CB process. Of these fourteen teachers, only seven completed the
88
Follow-up survey. The Follow-up sample was comprised of these seven
respondents.
Sample Demographics
Descriptive statistics were used to analyze the demographic characteristics of
the teachers sampled in this study. Variables of interest were: (1) job title; (2)
region; (3) years in current position; (4) years in ECE; and (5) highest level of
education. Frequencies, means, and standard deviations for teacher demographic
variables will be reported in Chapter Four.
Chi square analyses were used to determine whether the matched sample
differed significantly from the population (i.e., all teachers who filled out at least one
survey) on job title, region, and highest level of education. Independent samples t-
tests were used to determine whether the matched sample differed significantly from
the population (i.e., all teachers who filled out at least one survey) on Years Working
in ECE, Years in Current Position, Pre-test self-efficacy scores, and Pre-test
knowledge scores. The results of these analyses are presented in Chapter Four.
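A minimal sketch of these two comparisons, using SciPy and invented respondent-level data (the variable names and values are hypothetical, not study data), is shown below.

import pandas as pd
from scipy import stats

# Hypothetical respondent-level data: whether each pre-test respondent is in the
# matched sample, plus one categorical and one continuous variable of interest.
df = pd.DataFrame({
    "matched": [True, True, False, False, True, False, False, True],
    "job_title": ["Teacher", "TA", "Teacher", "TA", "TA", "Teacher", "TA", "Teacher"],
    "years_in_ece": [12, 20, 8, 15, 25, 5, 18, 10],
})

# Chi-square test of independence: does the job title distribution differ by matched status?
table = pd.crosstab(df["matched"], df["job_title"])
chi2, p_chi, dof, _ = stats.chi2_contingency(table)

# Independent-samples t-test: do the two groups differ on years working in ECE?
t, p_t = stats.ttest_ind(df.loc[df["matched"], "years_in_ece"],
                         df.loc[~df["matched"], "years_in_ece"])
print(f"chi-square = {chi2:.2f} (p = {p_chi:.3f}); t = {t:.2f} (p = {p_t:.3f})")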
Quantitative Analysis
Analyses of the pre-, post-, and follow-up surveys were completed to answer
each research question. Table 3 provides an overview of how each research question
is related to survey items and/or documents, quantitative or qualitative analyses that
were conducted, and the Kirkpatrick (2001, 2006; D. L. Kirkpatrick & Kirkpatrick,
2007) Framework level measured.
89
Table 3
Research Questions, Survey Items, and Statistical Analyses

Research Question 1: Was the training designed to effectively meet the needs of
teachers according to andragogical theory and the criteria for effective professional
development?
Research Question 1a: What were participants' reactions to the CB Process training?
Survey and Item: Post-test: 3:2
Statistical Analysis: Descriptive statistics (mean, standard deviation, percent
agree/strongly agree); qualitative analysis
Kirkpatrick Framework: Reaction

Research Question 2: What is the relationship between participation in the CB
Process training and participants' perceived self-efficacy?
Survey and Item: Pre-test: 4:1; Post-test: 4:1; Follow-up: 6:1
Statistical Analysis: Paired samples t-test
Kirkpatrick Framework: Learning

Research Question 3: What is the relationship between participation in the CB
Process training and participants' understanding of the CB Process?
Research Question 3a: Is there a relationship between participants' reactions and
participants' understanding of the CB Process?
Survey and Item: Pre-test: 5:1-8, 6:1-3; Post-test: 3:2, 5:1-8, 6:1-3
Statistical Analysis: RQ3: Paired samples t-test; RQ3a: Pearson r correlation
Kirkpatrick Framework: Learning

Research Question 4: Are participants able to successfully implement the objectives
taught in the Challenging Behavior Process training?
Survey and Item: Follow-up: 4:1
Statistical Analysis: Descriptive statistics (mean, standard deviation, percent
agree/strongly agree)
Kirkpatrick Framework: Transfer
90
For Research Question One, descriptive statistics were used to analyze the
frequency of follow-up support, types of support offered, and coherence of the
training with other professional development, teacher and organization goals, and
standards and assessments. These data were used to complete an andragogical learner
analysis (Knowles et al., 2005) and to assess the degree to which the training
exhibited Garet et al.’s (2001) features of effective PD. For Research Question 1a,
means and standard deviations for the six items related to participants' reactions
were calculated, as well as the percentage of participants who indicated that they
agreed or strongly agreed with each of the reaction statements.
For Research Question Two, Pre- and Post-test TSES full scale and subscale
scores were calculated. Paired sample t-tests were used to determine if there was a
statistically significant difference in self-efficacy scores from pre- to post-test.
For Research Question Three, Pre- and Post-test knowledge composite scores
were calculated. A paired sample t-test was used to determine if there was a
statistically significant difference in composite knowledge scores from pre- to post-
test. For Research Question 3a, the change in knowledge score from pre- to post-test
was calculated. This change score was then correlated with the Reaction Item scores
using a Pearson correlation.
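A minimal sketch of the paired-samples t-test and the correlation of change scores with reaction scores, using SciPy and invented scores for illustration only, is shown below.

import numpy as np
from scipy import stats

# Hypothetical matched pre-/post-test knowledge scores and post-test reaction scores.
pre_knowledge = np.array([4, 5, 3, 6, 4, 5])
post_knowledge = np.array([6, 6, 5, 7, 5, 6])
reaction = np.array([3.2, 3.0, 3.4, 3.6, 2.8, 3.2])

# Research Question Three: paired-samples t-test on pre- versus post-test knowledge.
t, p = stats.ttest_rel(pre_knowledge, post_knowledge)

# Research Question 3a: Pearson correlation between the pre-to-post change score
# and participants' reaction scores.
change = post_knowledge - pre_knowledge
r, p_r = stats.pearsonr(change, reaction)
print(f"paired t = {t:.2f} (p = {p:.3f}); Pearson r = {r:.2f} (p = {p_r:.3f})")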
For Research Question Four, the mean and standard deviation for the
Transfer Item were calculated, as well as the percentage of participants who indicated
that they agreed or strongly agreed with the item.
91
Coding of Qualitative Responses and Expansion of Quantitative Data
Qualitative analysis was used to code and analyze data from the interview,
document analysis, and open-ended survey responses to answer Research Question
One and to provide contextualization for the other findings. The researcher reviewed
the documents to complete the andragogical learner analysis (See Appendix A;
Knowles et al., 2005) and an assessment of the six features of effective professional
development (Garet et al., 2001) discussed earlier. The results of the qualitative
analyses were juxtaposed with the quantitative results in order to give a more
complete answer to the research questions.
Assumptions
The researcher has made the following assumptions:
1. The measures utilized in this study are reliable, valid indicators of
constructs.
2. The measures have been accurately recorded and analyzed.
3. All subjects responded honestly and openly to the survey and/or the
interview questions.
4. The CBECE division will welcome and use the information discovered in
this study to inform future professional development efforts.
The findings of the study are presented in the next chapter. First, the sample
demographics that inform the findings are presented. Next, the research findings,
organized by research question, are presented. The chapter concludes with a
summary of findings.
92
CHAPTER FOUR
RESULTS
The purpose of this study was to evaluate the effectiveness of the CB Process
training by examining the relationship between participation in the training and
participants' reactions, self-efficacy, learning, and knowledge transfer. The
researcher also explored whether the training was designed and implemented to meet
the needs of adult learners. The study utilized a mixed methods approach to examine
the relationship between participation in the CB Process training and Levels one,
two, and three of the Kirkpatrick framework: reactions, learning, and transfer.
This chapter presents the results of the data analysis. Participant
characteristics including job type, years of experience, region, and education level
are presented in the first section of this chapter. The next section will present the
results of the data analysis as they pertain to each of the four research questions.
Discussion of these results will be presented in Chapter Five.
Descriptive Results
As noted in Chapter Three, 105 pre-test and 105 post-test surveys were
administered to teaching staff either online or in-person across five regions on Oahu.
The current study focused on the 24 respondents who completed both a pre-test and
post-test survey. These respondents represented 22.9% of the total staff surveyed
and are referred to as the matched sample. The response rates to both the pre- and
post-test surveys are presented in Table 4. In addition to the matched sample,
fourteen teaching staff were identified by their ECs as having participated in the CB
93
Process. Of these fourteen teachers, seven completed the Follow-up survey. The
Follow-up sample was comprised of these seven respondents.
Table 4
Response Rates
Survey Type Response Rate (N = 105)
Pre-test Survey 63.8%
Post-Test Survey 34.3%
Matched Pre- /Post- Survey 22.9%
Matched Sample Demographics
Demographic characteristics of the pre-test sample, the matched sample, and
the Follow-up sample are presented in Table 5. As shown in Table 5, the matched
sample represented staff from all five of the target regions and all three target job
categories. Respondents also reported a range of educational backgrounds as
illustrated in Table 5.
94
Table 5
Sample Demographics
Type
Pre-Test
(N = 66)
Matched
(N = 24)
Follow-up
(N = 7)
Region
Region 1 15 7 2
Region 2 9 4 0
Region 3 10 2 0
Region 4 12 5 2
Region 5 20 6 3
Position
Teacher 30 12 2
Teaching Assistant 26 10 5
Extended Day Staff 10 2 0
Education Level
Some college 3 1 0
CDA 18 5 3
AA/AS 13 5 1
Four Year 11 4 0
Masters 21 9 3
95
Matched sample respondents’ years of experience in their current jobs ranged
from 1 – 29 with a mean of 11.39 years (SD = 8.659). Similarly, the range of
experience working in Early Childhood Education ranged from 3 – 28 years with a
mean of 15.42 years (SD=7.512). More than half of the respondents in the matched
sample, 60.9%, had been employed in their current position for more than ten years
and half of all matched sample respondents had worked for 20 or more years in the
field of Early Childhood Education.
Chi-Square analyses indicated that the matched sample is representative of
the Pre-test participants in that it did not differ significantly from the total population on
Job Type or Region. Furthermore, the matched sample did not differ from the rest of
the respondents on Education Level. Independent Samples T-tests indicated that the
24 respondents also did not differ significantly from the rest of the respondents on
Years in Current Position or Years Working in ECE. Because of the small sample
sizes, data were not disaggregated by any of these variables.
Follow-Up Survey Response Rate and Demographics
As mentioned earlier in this chapter and in Chapter Three, teaching staff
who utilized the CB Process from August to December 2011 were identified to
participate in the Follow-up survey. Seven classrooms in Regions 1, 4, & 5 had
students who required Intervention or Individual Support Plans during this time
period. Regions 2 & 3 did not have any students requiring Intervention or Individual
Support Plans between August and December 2011. Fourteen survey requests were
sent out to the teachers and teaching assistants working with those students. There
96
were no extended day staff that participated in the CB Process. Seven staff members
responded to the survey and this resulted in a response rate of 50%. The
demographic characteristics of the Follow-up sample are illustrated in Table 5. Five
respondents were teaching assistants and two were teachers. Chi Square analyses
indicate that the Follow-up sample was not significantly different from the
population on Job Type, Region, or Level of Education.
As mentioned in Chapter Three, the researcher anticipated that 5% to 11% of
students in CBECE would require the supports provided by the CB Process. There are
752 students in Regions 1-5. The seven students who participated in the CB Process
represent less than 1% of the population. Due to the low referral rate of students into the
CB Process and the low response rate of teachers, the Follow-up survey sample size
was very small. This small sample size limited the quantitative analysis
of follow-up survey data.
Results
The research questions that guided this investigation were:
1. Was the training designed to effectively meet the needs of teachers
according to andragogical theory and the criteria for effective
professional development?
a. Kirkpatrick Level 1 (Reaction): What were participants’ reactions
to the CB Process training?
97
2. Kirkpatrick Level 2 (Learning): What is the relationship between
participation in the CB Process training and participants’ perceived self-
efficacy?
3. Kirkpatrick Level 2 (Learning): What is the relationship between
participation in the CB Process training and participants’ understanding
of the CB Process?
a. Is there a relationship between participants’ evaluation of the CB
Process training and participants’ understanding of the CB
Process?
4. Kirkpatrick Level 3 (Transfer): Do participants report being able to
successfully implement the objectives taught in the CB Process training?
The following sections will present the results of analyses for the above research
questions.
Findings Related to Research Question One
Research Question One is, “Was the training designed to effectively meet the
needs of teachers according to andragogical theory and the criteria for effective
professional development?” The aim of the question was to determine if the training
met the six criteria for effective professional development (Garet et al., 2001) and
was designed to meet the needs of adult learners (Knowles et al., 2005). Both
quantitative and qualitative methods were used to assess this research question.
Information about the training obtained through the interview of the Director of
Instructional Support and from the training materials was discussed in Chapter
98
Three. This section will present quantitative data from the Post-test and Follow-up
survey and qualitative survey results. These data and the information presented in
Chapter Three will then be used to conduct an andragogical learner analysis
(Knowles et al., 2005) and to determine if the training met the six criteria for
effective professional development (Garet et al., 2001).
Quantitative findings. The Post-test and Follow-up survey items related to
this research question ask participants to rate the coherence of the training to their
professional development and practice, the types of support participants felt were
necessary for implementation, and the types of support that were actually received
during implementation.
Consistency. Post-test Section 3, Item 1 consists of five statements that
asked participants to rate the extent to which the CB Process training was consistent
with their own goals and organization goals, with their earlier learning, and with the
standards and assessments of the profession. Participants responded to each item
using a 4-point Likert scale with the following anchors: 1 = Not at all; 2 = To some
degree; 3 = Quite a bit; 4 = Completely. Table 6 presents the mean and standard
deviation for each question on the Post-test, as well as, the percentage of participants
who responded “Quite a bit” or “Completely” on both the Post-test and Follow-up
surveys.
The results of the descriptive analyses indicate that most teachers at the Post-
test felt that the CB Process training was consistent with their own goals for
professional development (68%), with KS’s plans to change practice (84%), with
99
previous professional development (72%), with preschool standards (80%), and with
KS assessments (84%). Although the percentage of teachers who agreed with these
statements was lower at Follow-up for four of the five items, most teachers still
agreed with these statements (68%). The results suggest that teachers believe the CB
Process training to be part of a coherent professional development plan.
Table 6
Participant Evaluation of Coherence
Post-test Follow-up
To what extent was the professional development
activity: M(SD)
%
Agree
%
Agree
Consistent with your own goals for PD? 2.92 (.76) 68 68
Consistent with organization plan to change practice? 3.24 (.72) 84 68
Based on what you learned in earlier PD? 2.96 (.74) 72 83
Designed to support preschool standards? 3.00 (.65) 80 68
Designed to support KS Assessments? 3.16 (.80) 84 68
Follow-up support. Post-test Section 3, item 4 asked teachers to indicate
what types of support were needed to apply what was learned in the classroom. The
types of support needed and frequency of participant responses are shown in Table 7.
100
Table 7
Type and Frequency of Needed Support
Type of Support Needed Frequency
More time to learn 32%
Time to practice 24%
Coaching 56%
Collaboration with peers 64%
More planning 52%
More research based evidence 20%
The supports that were most commonly chosen by teaching staff were coaching
(56%), collaboration with peers (64%), and additional planning time (52%).
A majority of staff members (84%) expected to receive some type of follow-up
support. The types of support expected and the frequency of participant responses are
shown in Table 8. The type of support most expected was follow-up or
coaching from the IS (68%).
101
Table 8
Type of Support Expected
Type of Support Expected
Frequency
(N = 24)
Nothing 16%
Follow up/ Coaching by Trainers 24%
Follow up/ Coaching by Instructional Specialist 68%
Follow up/ Coaching by Outreach Counselor 32%
Follow up/ Coaching by Peers 24%
Follow up/ Coaching by Administrators 36%
Qualitative survey findings. There were relatively few responses to the
open-ended questions on the Post-test and Follow-up surveys. Only three
participants answered the open-ended Post-test survey questions. These three
respondents indicated that they did not find the training to be relevant and/or useful.
These responses were contradicted by the quantitative data. On the Follow-up
Survey, two themes emerged. These themes are presented in Table 9. First,
participants indicated that coaching by colleagues and/or support staff was helpful
during the CB Process. The second theme was that the CB Process was not effective
in addressing the challenging behaviors displayed by their student(s). The first
theme is supported by the data that indicate that teachers feel coaching is necessary
for successful implementation and that they expect this type of support to occur.
102
However, the second theme indicates that the CB process may not be as successful as
indicated in the teacher self-report data.
Table 9
Open Ended Responses to Follow-up Surveys
Theme Response
Coaching was helpful
(Materials provided as
part of coaching)
“meeting with the team EC/IS/OC”
“offering a variety of strategies to try in the classroom”
“IS and OC providing learning materials for child on
plan.”
“Ideas and feedback”
“The information and knowledge that was provided by
the support team, it was from experience they spoke
from”
“I've had discussions with the teacher about this child’s
behavior.”
CB Process does not
result in behavior
changes
“The student in question has not complied with any of
the training that T&TA have gone through.”
“Not all students respond well to the Challenging
Behavior process.”
“NO change in behavior of child”
103
Andragogical learner analysis. The andragogical learner analysis (Knowles
et al., 2005) discussed in Chapter Three was conducted using the information gained
from the interview of the Director of Instructional Support, the review of training
materials, and the demographic data. The results of the analysis are presented in
Table 10.
The andragogical analysis indicated that the teachers who participated in the
study fit the core assumptions of the andragogical model. There were two important
factors that stood out in this analysis. The first was that the teachers in this sample
appear to be highly self-directed. Second, 50% of the teaching staff have 20 or more
years of experience in ECE. The implications of these results are discussed in
Chapter Five.
104
Table 10
Andragogical Learner Analysis
Applies?
Expected Influence of
Individual and Situational Differences Goals and Purposes for Learning
Andragogical Principle Subject Matter Individual Learner Situational Individual Institutional Societal
Adults need to know
why they need to learn
something
CB Process is
an essential part
of PBS
Improve teacher
efficacy.
Enhance
understanding of
the CB Process
Improve
implementation
of the CB
Process
Strong
social-
emotional
skills
correlated
with school
success
The self-concept of
adults is heavily
dependent upon a move
toward self- direction
New and
complex
subject matter
High initial levels
of efficacy.
Coaching &
Collaboration
highly valued
Need for
consistency
across CBECE
requires a scripted
training approach
Prior experiences of the
learner provide a rich
resource for learning
Three years of
prior PBS
training
High educational
attainment. Many
years of experience
105
Table 10, continued
Applies?
Expected Influence of
Individual and Situational Differences Goals and Purposes for Learning
Andragogical Principle Subject Matter Individual Learner Situational Individual Institutional Societal
Adults typically become
ready to learn when
they experience a need
to cope with a life
situation or perform a
task
All teachers will
eventually take part
in the CB Process;
some already have
experience
Teachers requested
CB Process
training
Need for
program is
readily apparent
in their everyday
jobs
Adults orientation to
learning is life /
problem centered
Most teachers have
encountered
challenging behavior
The motivation for adult
learners is internal
rather than external.
Stage theory: <20
years will evidence
high interest in this
topic
Stage theory: 20+
years may have
decreased
engagement
Note. Data used for the analysis were taken from the interview of the Director of Instructional Support, the review of training materials, and the
demographic data obtained from the matched sample. Adapted from “The adult learner: The definitive classic in adult education and human
resource development,” by M.S. Knowles, E.F. Holton, and R.A. Swanson, 2005, loc. 1761.
106
Effective professional development. Table 11 shows the findings of
Research Question One as they related to the features of effective professional
development presented by Garet et al. (2001). The data from the interview of the
Director of Instructional Support, the review of training materials, and the Post-test
and Follow-up survey data discussed earlier were used to determine if each feature of
effective professional development was present or not present. Table 11 summarizes
the data. As illustrated in Table 11, the training possessed five of the six features of
effective professional development. Because the CB process training focused on
training teachers to enact a process rather than on enhancing their teaching in a
particular subject area, it did not possess the sixth feature. The implications of this
finding are discussed in Chapter Five.
107
Table 11
Features of Effective Professional Development
Feature Result CB Process training
Content
knowledge
Not Present • Focus on teaching strategies and process
not on subject matter content
Active learning Present • Small Group Scenario Activity
• Collaboration and Coaching
Coherence Present • 68% + teachers agree or strongly agree
with survey items measuring coherence
Form of the
activity
Workshop
On-going
• Actual training: Workshop Format
• Follow-up Supports included Coaching
and Collaboration
Collective
participation
Present • Small Group Scenario Activity
• Coaching and Collaboration
Duration Present • CB Process training: Short Duration
• PBS Training: Five Year Training Plan
• Coaching and Collaboration
Summary of findings for Research Question One. Participants believe the
CB Process training to be part of a coherent professional development plan that fits
with the goals and standards of teachers, KS, and the profession in general. Many
participants indicated a need for follow-up support from colleagues and coaches and
also indicated that this type of support was expected. When combined with the data
gathered from the qualitative interview and analyses of the training materials, these
108
findings suggest that the CB process training was designed and implemented to meet
the needs of adult learners and to be an effective professional development activity.
Findings Related to Research Question 1a
Research Question 1a asked, “What were participants’ reactions to the CB
Process training?” This question focused on the first level of the Kirkpatrick (2001,
2006; D. L. Kirkpatrick & Kirkpatrick, 2007) Framework, Reaction.
Post-Test Section 3, Item 1 contained six items that were used to measure
participants’ immediate reactions to the CB Process training. Participants responded
to each item using a 4-point Likert scale with the following anchors: 1 = Strongly
Disagree; 2 = Disagree; 3 = Agree; 4 = Strongly Agree. Table 12 presents the means
and standard deviations for each item, as well as the percentage of participants who
indicated that they agreed or strongly agreed with the statement.
Table 12
Participant Evaluation of Professional Development Experience
Question M(SD) % Agree (N = 24)
Relevant to my professional practice 3.24 (.44) 100
Quality of information was sound 3.28 (.46) 100
Information conveyed effectively 3.16 (.47) 96
Likely to implement the practices taught 3.24 (.44) 100
Students will likely benefit 3.24 (.52) 96
Confident in my ability to implement 3.20 (.50) 96
109
The majority of participants agreed or strongly agreed with all six reaction
items. These results indicate that participants felt the training was relevant, the
information was of good quality, was conveyed effectively, and that students would
likely benefit from the practices taught. Furthermore, participants indicated that they
were both likely to implement these practices and confident in their ability to do so.
Findings Related to Research Question Two
Research Question Two examined the relationship between participation in
the CB Process training and teachers’ self-efficacy for teaching. The aim of
Question 2 was to assess Kirkpatrick level two, Learning; more specifically to assess
whether participation in the CB Process training was related to a change in attitude.
Teacher efficacy was measured in the following areas: (1) Overall Efficacy; (2)
Efficacy in Instructional Strategies; (3) Efficacy in Student Engagement; and (4)
Efficacy in Classroom Management. Responses to each question were determined
using the following anchors: 1 = Not at All; 3 = Very Little; 5 = Some degree; 7 =
Quite a Bit; and 9 = A Great Deal. Descriptive analyses were performed to examine
participants’ overall Pre-test and Post-test efficacy levels. Mean scores indicated
that teachers reported high levels of efficacy overall and on all subscales on both the
pre- and post-tests; the means for total score and all subscales were higher than seven
(Quite a Bit). The Pre- and Post-test means and standard deviations are presented in
Table 13.
110
Table 13
Mean Pre- and Post-test Scores for TSES
Pre-Test Post-Test
Scale M SD M SD
Total 7.34 .90 7.29 .87
Instructional Strategies 7.16 .85 7.17 .90
Student Engagement 7.44 .97 7.41 1.03
Classroom Management 7.42 1.00 7.29 .91
Next, participants’ Pre- and Post-test efficacy scores were compared. Paired
samples t-tests revealed no statistically significant differences in total TSES scores.
Furthermore, there was no statistically significant difference in pre- and post-test
scores on Instructional Strategies, Student Engagement, or Classroom Management. These results indicate that there was no relationship between participation in the CB Process training and teachers' efficacy. However, the t-tests also indicated a statistically non-significant but consistent negative trend from pre- to post-test across total efficacy and all three subscales; teachers' efficacy scores were slightly lower after receiving the training than they were before.
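A paired-samples t-test of this kind is straightforward to run in standard statistical software. The following is a minimal sketch using SciPy with hypothetical matched Pre- and Post-test totals; it is illustrative only and does not reproduce the study data.

    # Minimal sketch: paired-samples t-test on matched pre/post efficacy scores.
    # The arrays below are hypothetical placeholders, not the study data.
    import numpy as np
    from scipy import stats

    pre  = np.array([7.1, 7.5, 6.9, 7.8, 7.2, 7.4, 7.0, 7.6])
    post = np.array([7.0, 7.4, 6.8, 7.9, 7.1, 7.3, 6.9, 7.5])

    t, p = stats.ttest_rel(pre, post)   # dependent (paired) samples t-test
    print(f"t({len(pre) - 1}) = {t:.3f}, p = {p:.3f}")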
Paired samples t-tests conducted on Pre-test and Follow-up TSES scores uncovered a statistically significant negative difference on the Instructional Strategies subscale [t(5) = -3.124, p = .026]. Teachers' self-efficacy regarding the use of varied teaching strategies was lower after their experience implementing the CB Process. This result is particularly noteworthy given the small sample size. There were no statistically significant findings for the total score or the other subscales.
Findings Related to Research Question Three
Research Question Three examined the relationship between participation in
the CB Process training and participants’ understanding of the CB Process. The aim
of Question Three was to assess Kirkpatrick Level 2, Learning; more specifically, to
assess whether participation in the CB Process training was related to a change in
knowledge about the CB Process. For each of the ten knowledge-based, multiple-choice questions, participants' answers were coded "correct" or "incorrect" and the total number of correct answers was calculated. Participants' mean pre-test score was M = 4.62, SD = 1.50, and their mean post-test score was M = 5.71, SD = 1.52. A paired samples t-test indicated that the difference between pre- and post-test knowledge scores was statistically significant [t(20) = 2.16, p = .04]. These results indicate an overall rise in participants' knowledge about the CB Process following the training.
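The scoring step itself can be sketched as follows; the answer key and responses shown are hypothetical stand-ins for the ten multiple-choice items, and the resulting totals would then be compared with a paired t-test as above.

    # Minimal sketch: code each answer as correct/incorrect against a key and
    # total the correct answers per participant. Key and responses are hypothetical.
    key = ["B", "A", "C", "D", "A", "B", "C", "D", "A", "B"]

    def total_correct(responses, answer_key):
        """One point for each item whose response matches the key."""
        return sum(r == k for r, k in zip(responses, answer_key))

    pre_answers  = ["B", "C", "C", "A", "A", "B", "C", "B", "D", "B"]   # one participant, pre-test
    post_answers = ["B", "A", "C", "A", "A", "B", "C", "D", "A", "B"]   # same participant, post-test

    print(total_correct(pre_answers, key), total_correct(post_answers, key))   # prints 6 9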
Item analysis. Table 14 shows the number of participants who answered
each question correctly on the Pre- and Post-test; the change from Pre- to Post-test is
also noted. Item analysis produced several important findings. The first was that teachers showed large gains on two test items in particular: questions 8 and 9, which asked teachers to apply their understanding of the CB Process to possible classroom scenarios. The second finding was that the majority of participants incorrectly answered three of the ten questions on both the pre- and post-tests. These three questions (questions 2, 4, and 5) were factual questions about various parts of the CB Process.
Table 14
Number of Participants with Correct Answer
Question    Pre-Test (N = 24)    Post-Test (N = 24)    Change
Q1: What form is used to document the occurrence of challenging behaviors in the classroom?    20    22    +2
Q2: Who may be on the Regional Implementation Team (RIT)? (check all that apply)    4    3    -1
Q3: If an EC observes that challenging behaviors stem from classroom practices, the next step in the CB Process is:    17    13    -4
Q4: Which is NOT one of the purposes of the Intervention Plan Meeting?    1    1    0
Q5: Which of the following is used to address challenging behaviors that are serious safety concerns and extreme? (check all that apply)    0    2    +2
Q6: What is the purpose of the Functional Behavior Assessment?    18    20    +2
Q7: Who participates in a Functional Behavior Assessment?    20    20    0
Q8: Intervention Plan Scenario Question    8    16    +8
Q9: Individualized Support Plan Scenario Question    9    16    +7
Q10: Intervention Plan Scenario Question 2    13    14    +1
113
Regarding the first finding from the item analysis, not only did teachers perform well on questions eight and nine at the time of the post-test (16 of 24 participants answered each correctly), but they also demonstrated the largest gains on these two questions: eight more teachers answered question eight correctly, and seven more answered question nine correctly. A paired samples t-test indicated that the change in scores on item nine was not statistically significant, but that the difference between pre- and post-test scores on item eight was statistically significant [t(16) = 2.95, p = .009].
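The per-item counts reported in Table 14, and the item-level changes examined here, can be derived from a matrix of scored responses. The sketch below uses a small hypothetical 0/1 correctness matrix (rows are participants, columns are items), not the actual data.

    # Minimal sketch: per-item counts of correct answers at pre- and post-test
    # and the change per item. The 0/1 matrices below are hypothetical.
    import numpy as np

    # rows = participants, columns = items (1 = correct, 0 = incorrect)
    pre  = np.array([[1, 0, 1, 0, 0],
                     [0, 0, 1, 0, 0],
                     [1, 1, 0, 0, 1]])
    post = np.array([[1, 0, 1, 1, 0],
                     [1, 0, 1, 1, 1],
                     [1, 1, 1, 0, 1]])

    pre_counts  = pre.sum(axis=0)    # number correct per item, pre-test
    post_counts = post.sum(axis=0)   # number correct per item, post-test
    print(pre_counts, post_counts, post_counts - pre_counts)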
Despite the positive overall knowledge gain and the gains on the two scenario questions, the second finding indicated that only three teachers correctly answered Item 2, one teacher correctly answered Item 4, and two teachers correctly answered Item 5. These three items were analyzed to determine how participants answered them and what errors were made on each question.
Item 2. Only 3 of the 24 respondents correctly answered Item 2 on the Post-
test Survey. Participant responses to Item 2 are presented in Figure 2. Analysis of
responses to Item 2: “Who may be on the Regional Implementation Team? (check all
that apply)” indicated that the majority of respondents correctly identified the
Educational Coordinator, Instructional Specialist, and Outreach Counselor as
members of the Regional Implementation Team (RIT). However, a large number of
respondents incorrectly identified classroom teachers as members of the RIT.
Furthermore, almost half of the respondents did not know that the Assistant
Education Coordinator may be on the RIT. This analysis suggests that there may be
confusion about the teacher’s role in the PBS process.
Figure 2. Pre- and Post-test answers to Knowledge Item 2: "Who may be on the Regional Implementation Team (RIT)? (check all that apply)" Number of participants selecting each option (Pre-test / Post-test): EC 21/20; AEC 12/11; PBS Member 4/4; OC 19/18; Teacher 15/14; IS 21/20.
Item 4. Only 1 of the 24 respondents correctly answered Item 4 on the Post-test Survey. Participant responses to Item 4 are presented in Figure 3. Analysis of responses to Item 4: "Which is NOT one of the purposes of the Intervention Plan Meeting?" indicated that only one respondent correctly identified "to respond to challenging behaviors that are serious safety concerns and extreme." The remaining respondents overwhelmingly identified either "to respond to challenging behaviors that are minor and ongoing" or "to introduce the family to the PBS
process”; both of which are, in fact, main goals of the Intervention Plan Meeting.
This suggests that teachers are confused about the purpose of the Intervention Plan
Meeting and may have confused the Intervention and Individual Support Plans.
Figure 3. Pre- and Post-test answers to Knowledge Item 4: “Which is NOT one of
the purposes of the Intervention Plan Meeting? (check all that apply)”
Item 5. Finally, only 2 of the 24 respondents correctly answered Item 5 on the Post-test Survey. Participant responses to Item 5 are presented in Figure 4. Analysis of Item 5: "Which of the following is used to address challenging behaviors that are serious safety concerns and extreme? (check all that apply)" indicated that
the majority of respondents correctly identified the Behavior Incident Report (BIR)
and immediate RIT meeting as tools to address this issue. Many fewer participants
were able to identify Immediate Support Guidelines, Functional Behavior Analysis
(FBA), and Individualized Support Plan as tools used to address this issue.
Furthermore, almost half of all respondents incorrectly identified the Intervention
Plan as a tool used to address this issue. These findings suggest that teachers are
confused about the forms and processes that are used to address extreme behaviors or
may be confused between the Intervention Plan and Individual Support Plan
processes.
Figure 4. Pre- and Post-test answers to Knowledge Item 5: “Which of the following
is used to address challenging behaviors that are serious safety concerns and
extreme? (check all that apply)”
Findings Related to Research Question 3a
Research Question 3a examined the relationship between participants' evaluation of the CB Process training and participants' understanding of the CB Process. The aim of Question 3a was to examine the relationship between Kirkpatrick Levels 1 and 2. In particular, did participants who responded positively to the training demonstrate increased knowledge about the CB Process after taking the training? First, the change in participants' knowledge scores from Pre- to Post-test was calculated. These change scores were then correlated with participants' responses to the six reaction questions analyzed in Research Question One (see Table 6 for means and standard deviations). Pearson correlations indicated no statistically significant relationships between any of the six reaction questions and participants' knowledge change scores. These results indicate that there was no relationship between participants' positive evaluation of the CB Process training and their learning.
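For reference, the correlation step can be sketched as below, assuming each participant's knowledge change score (post minus pre) is paired with one of the reaction ratings; the values are hypothetical and purely illustrative.

    # Minimal sketch: Pearson correlation between knowledge change scores
    # (post minus pre) and one reaction item. All values are hypothetical.
    import numpy as np
    from scipy import stats

    pre_knowledge  = np.array([4, 5, 3, 6, 4, 5, 6, 3])
    post_knowledge = np.array([6, 5, 5, 7, 5, 6, 7, 4])
    reaction       = np.array([3, 4, 3, 4, 3, 3, 4, 3])   # 4-point Likert rating

    change = post_knowledge - pre_knowledge
    r, p = stats.pearsonr(change, reaction)
    print(f"r = {r:.2f}, p = {p:.3f}")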
Findings Related to Research Question Four
Research Question 4 examined whether participants were able to successfully
implement the objectives taught in the CB Process training. The aim of the question
was to assess Kirkpatrick Level 3, transfer, by asking participants to rate their own
success at implementing the skills taught in the CB Process training. Analysis of
Research Question 4 was limited by the small number of Follow-up surveys that
were collected. Transfer was measured using one item on the follow-up survey.
This item stated, "I have been able to successfully implement the major objectives taught in the Challenging Behavior Process Training." Responses to this item were determined using the following anchors: 1 = Strongly Disagree; 2 = Disagree; 3 = Agree; 4 = Strongly Agree. All of the follow-up survey participants (100%) responded by choosing "Agree." Although the sample was small, these findings suggest that the training was successful at creating transfer, or a change in behavior.
Summary of Findings
The current study resulted in several positive findings, some of them statistically significant.
First, by examining participants’ reactions to the training, the training materials,
interview data, and participants’ open ended responses, it appears that the CB
Process training was designed to effectively meet the needs of adult learners.
Second, the training effectively resulted in a change in knowledge about the CB
Process; knowledge scores were significantly higher after the training than they were
before the training. Third, although there were only a few teachers who responded to
the Follow-up survey, all agreed that they were able to successfully implement the
skills taught in the CB Process training. This suggests that the training did result in
behavior change; teachers successfully used their new skills in the classroom.
Despite these positive findings, there was one statistically significant negative finding, along with some trends that are important to note. Although participants reacted positively to the CB Process training, the findings indicate that participation in the training was not related to immediate changes in teacher efficacy. Teacher efficacy in the area of Instructional Strategies, however, showed a statistically significant negative change at Follow-up. Teachers who actually implemented the CB Process in their classrooms demonstrated lower efficacy for instructional strategy use.
It is also noteworthy that despite the statistically significant increase in
knowledge about the CB Process, there were a number of questions that a majority of
teachers answered incorrectly even after the training. The three questions that
teachers were unable to answer dealt with basic factual knowledge about PBS and
the CB process and indicate a gap in understanding.
The next chapter will begin with a brief review of these findings. The
discussion will utilize the research literature on adult learning and effective
professional development to contextualize and give meaning to these findings.
Finally, recommendations are made both for the organization's professional development planning and for future research.
120
CHAPTER FIVE
DISCUSSION
Research suggests that effective teachers can make a difference in changing student outcomes (Darling-Hammond, 1999; Jordan, Mendro, & Weerasinghe, 1997; Kane, Rockoff, & Staiger, 2006; Sanders & Horn, 1998). Professional development is one of the main avenues used to improve teachers' efficacy and impart knowledge and skills (Hill, 2007). Unfortunately, despite a great deal of research on the subject of professional development, there are relatively few rigorous studies from which we can draw conclusions about what makes professional development effective at producing these types of changes (Garet, Porter, Desimone, Birman, & Yoon, 2001; Guskey, 2009; Yoon, Duncan, Lee, Scarloss, & Shapley, 2007). The evaluation of professional development activities rarely extends beyond surveys of teachers' reactions to and satisfaction with the activity (Kutner et al., 1997), making it difficult to accurately assess the impact of professional development on classroom practice and student outcomes.
Furthermore, the complexities of studying professional development have made it
difficult to identify specific features of professional development that lead to positive
outcomes and to understand how those features operate in different contexts
(Guskey, 2000).
In response to these challenges, a number of researchers have suggested a research agenda that utilizes multiple studies examining the design and implementation of professional development activities, the contexts in which they are implemented, and the measurable outcomes of such activities (Borko, 2004; Penuel et al., 2007). Thus, the purpose of this study was to assess the efficacy of the CB Process training in ways that went beyond simply measuring teacher reactions, evaluating the effectiveness of the training at influencing both teachers' knowledge and teaching practice. This was accomplished by first ensuring that the training itself was well designed to meet the needs of the CBECE teaching staff. The researcher applied both the Andragogy in Practice model (Knowles et al., 2005) and Garet et al.'s (2001) features of effective professional development to complete this assessment. Second, the success of the CB Process training was evaluated using levels one, two, and three of the Kirkpatrick framework (2001, 2006; D. L. Kirkpatrick & Kirkpatrick, 2007). This chapter presents an analysis of the study findings, discusses implications for practice and future evaluations, and offers conclusions.
Summary of Findings
This section briefly summarizes the findings for the four research questions and then discusses how these findings relate to one another in order to provide a deep and nuanced evaluation of the CB Process training and its efficacy.
Design and Implementation of the CB Process Training
This study first examined the design and implementation of the CB Process
training to determine if the training met the needs of the teaching staff who
participated. Using both the Andragogy in Practice model (Knowles et al., 2005)
and Garet et al.’s (2001) six features of professional development to interpret the
results from research question one, it appears that this training was designed to be
effective and to meet the specific needs of the CBECE teaching staff.
Andragogy in Practice model. As discussed in Chapter Four, an
andragogical analysis of the CBECE teaching staff indicated that these teachers do
fit the core assumptions of the andragogical model. There were two important
factors that stood out in this analysis. The first was that the teachers in this sample
appeared to be highly self-directed. Therefore, the design of the training and follow-up should have provided opportunities for self-directed study to meet these learners' needs. Analysis of survey data and the training materials indicated that the CB Process training did account for this factor. During the training itself, there were two opportunities for teachers to collaborate with others to discuss and respond to common scenarios encountered during the CB Process. Follow-up survey data also indicated that teachers were given follow-up coaching opportunities, by both the RIT members and colleagues, that may have provided opportunities for teachers to engage in self-directed experiences and to use their own experiences as a resource in the learning process.
Second, 50% of the teaching staff have 20 or more years of experience in
ECE. Stage theory (Huberman, 1989; Villegas-Reimers, 2003) suggests that these
teachers may demonstrate lower engagement or be skeptical of innovation. For the
CB Process training to be most effective, extra attention should have been paid to
bolstering motivation for learning with these participants. Because the training was geared to meet the expressed needs of the teaching staff and was very problem-centered, it most likely provided the necessary motivation for highly experienced teachers to participate fully.
Six features of effective professional development. As demonstrated in Chapter Four, the CB Process training met five of the six criteria for effective professional development (see Table 11). Although the training itself was a short, one-shot, scripted workshop, it did provide some opportunity for active learning and collective participation, was coherent with other learning activities, and was part of ongoing professional development related to the central theme of PBS. The essential factor that helped the CB Process training to meet four of the six criteria for effectiveness was the availability and use of follow-up support, coaching, and collaboration. The data indicated that all members of the Follow-up sample received support and coaching through the CB Process.
While Garet et al. (2001) note that the types of professional development that are most effective are content-focused, this training was, by necessity, process oriented. However, the training did draw from the existing literature regarding the teaching of complex processes. Since teachers had already received training in the different skills necessary for participation in PBS, the CB Process training went the extra step suggested by Kincaid et al. (2006) of providing teachers with a more complete picture of the process itself. Furthermore, the type of follow-up coaching support provided to teachers implementing the CB Process has also been found to be effective in studies of process-focused trainings.
124
Evaluation of the CB Process Training
After using levels one, two, and three of the Kirkpatrick Framework to
evaluate the CB Process training, the findings of this study suggest that teachers had
very positive reactions to the training. Furthermore, the findings suggest that the
training was successful in making teachers more knowledgeable about the CB
Process and in helping teachers to successfully implement the CB process in their
classrooms. However, after the training, there were still noteworthy gaps in
teachers’ understanding of the CB Process. The finding that the training did not
result in immediate changes in teacher efficacy was also notable. In fact, the results
of the study show a statistically significant decrease in teacher efficacy in the area of
Instructional Strategies at the time of follow-up.
Putting the Pieces Together: Was the Training Successful?
In order to determine whether the CB Process training was successful, it is necessary to consider the findings summarized above as a whole, to utilize research to inform these findings, and then to consider contextual data that might shed light on the interpretation of results.
What does it mean to be successful? An initial observation about the use of
teacher self-report to determine the success of the CB Process training is that
teachers may define success differently from one another. In this case, did teachers
who felt successful feel this way because: (1) they were able to implement the CB
Process with fidelity; (2) they observed positive changes in student behavior; or (3)
the CB Process resulted in the student being removed from the classroom (i.e., a
more appropriate educational setting was necessary)? Interview data or more nuanced survey questions could provide greater depth of information about how teachers define success in the CB Process.
Teacher efficacy. The statistically non-significant results for changes in
efficacy at the time of the Post-test have a number of possible explanations. The first
explanation that should be considered is that the small sample size may have
hindered the researcher’s ability to find statistically significant results. This
explanation is unlikely given the remarkable lack of change in mean scores from pre-
to post-test (See Table 13).
The second explanation that should be considered is that teachers'
initial high levels of efficacy led them to be very sure of their ability to implement
processes discussed in the CB Process training. Research which indicates that
teachers with similarly high levels of efficacy are more likely to embrace and
implement innovative techniques (Allinder, 1995) and to feel in control of situations
(Tschannen-Moran & Woolfolk Hoy, 2001) provides support for this explanation.
The CBECE teachers had already received training for and practiced the skills used
in the CB Process and felt confident in their abilities; the CB Process training itself simply provided a framework for teachers to apply what they had learned previously. Because this training provided better information on how and when to implement previously learned skills, it did not result in changes to teacher efficacy.
Why did efficacy decrease at follow-up? Why, then, was there a statistically significant drop on the Instructional Strategies subscale at the time of follow-up? There are three possible explanations for this drop in efficacy. The first is that
the CB Process training was ultimately ineffective at preparing teachers to
implement the CB Process. This explanation is not supported by the data: all seven respondents indicated that they felt their implementation of the
CB Process was successful. Therefore, it is likely that there is another explanation
for this decline in efficacy. Allinder (1995) and Guskey (1988) each found that
when implementing new innovations, changes in teacher efficacy appear to be
curvilinear. The initial challenges associated with implementation have a negative
effect on efficacy. As teachers see evidence of student learning and develop
strategies to cope with the changes, teaching efficacy begins to increase. While PBS generally addresses behavior management, the CB Process requires teachers to implement new strategies for teaching and assessment. The challenges of doing so in the classroom may be the reason for this decrease in efficacy for instructional strategies. If this explanation is valid, then it will be important for the CBECE division to ensure that teachers are adequately supported throughout the CB Process. Guskey's (2000) model of teacher change indicates the importance of providing encouragement and ongoing support for new practices. This type of support can increase the likelihood of improved student outcomes, which then contribute to the eventual rise in teacher efficacy.
Yet another explanation is that these results may be skewed or unreliable due
to the very small sample size. With only seven respondents, one or two teachers
whose efficacy was negatively affected can skew the mean efficacy scores and lead
to a false finding of significance.
Either of the last two explanations could account for the statistically significant decrease in efficacy for Instructional Strategies at the time of follow-up; both explanations require more information in order to reach a conclusion. The implication for the CBECE division is that more information is needed. To better
explore the hypothesis of curvilinear development of teaching efficacy, further
follow-up assessment is necessary to see if teachers receive adequate support and if
teachers experience the predicted rise in efficacy as the school year progresses. Even
more helpful would be to examine the efficacy of these teachers after they complete
the CB Process with another child. Will their efficacy increase with more experience with the entire process? In order to determine the validity of the small-sample-size explanation, it will be necessary to continue to collect data from more teachers as
they go through the CB Process. As more data are gathered, further analysis can
determine if the findings of the current study hold up with a larger sample.
Teacher knowledge: are the gaps in knowledge significant? As
mentioned earlier in this chapter and in Chapter Four, there was a statistically
significant increase in knowledge about the CB Process from pre- to post-test.
However, despite this statistically significant increase, a majority of teachers
answered questions two, four, and five incorrectly. These three questions are basic, factual questions about the CB Process that were covered not only in this particular training but also in prior trainings (personal communication, Director of Instructional Support, October 26, 2011). An interesting contrast to teachers' performance on these factual questions was their performance on the two scenario questions (questions eight and nine), which asked teachers to apply their understanding of the CB Process to possible classroom scenarios. These findings
indicate that despite low scores on knowledge questions, participants were able to
correctly solve common scenario problems. When considered alongside the follow-
up test results that indicated 100% of the teachers in the follow-up sample felt that
they were successful at implementing the objectives taught in the CB Process
training, the findings suggest that teachers’ performance on scenario items may be
more predictive of their success at implementing the process than performance on
knowledge questions. However, the size of the follow-up sample was quite small, so
conclusions drawn from these results must be made with caution. As with teacher
efficacy, it will be necessary to continue to collect data from more teachers as they
go through the CB Process. As more data are gathered, further analysis can
determine if the findings of the current study hold up with a larger sample.
Alternative suggestions for interpreting the findings about teachers’
knowledge will be explored in the next section.
Referral rates: surprising results. One surprising finding of the study did not stem from any of the original research questions. As noted in Chapter Three, using past data the researcher estimated that the number of students whose behavior would necessitate the implementation of the CB Process ranged from 23 to 83 students in any given year. This estimate was drawn from the PBS literature (3 – 5% of students may need additional supports) and from the CBECE 2007 – 2008 school year data (11% of CBECE students needed additional support). The actual referral rate from August to November 2011 was seven students total; two regions reported no students who needed additional support. This low referral rate contributed to the small size of the follow-up sample and the difficulty in arriving at conclusions regarding the effectiveness of the CB Process training at Kirkpatrick Level 3, Transfer.
There are four possible explanations for the low numbers of students who
needed the supports offered by the CB Process. Explanations one, three, and four
relate directly to the findings for research question three. These explanations also
suggest implications for both future research and future evaluations of professional
development related to the CBECE PBS initiative.
Explanation one: PBS is working! The first explanation is a positive one: perhaps the low numbers of students who require additional support are an artifact of
implemented the proactive portions of the PBS system, then the challenging
behaviors reported prior to the implementation of PBS may have been addressed and
resolved before any intervention was necessary.
Of the four explanations offered here, this first explanation is the least likely.
The PBS research literature indicates that even when PBS is working well, about 3 –
5% of the student population will require the type of increased support provided by
the CB Process (Stormont et al., 2008). One way to gain more insight into this
explanation would be to review student outcome data at the end of the school year. Because of the time constraints of this study, those data were not available. However, reviewing the data collected on students' social and emotional development during the course of the school year might provide confirming or disconfirming evidence for this hypothesis.
Explanation two: Chance. The second explanation is that, by chance, there
are simply fewer students who exhibit challenging behaviors in this year's student
population. Expanding the scope of the current evaluation to include historical data
on referral rates both prior to and since the PBS system was introduced can shed
light on whether this level of referrals is an artifact of chance.
Explanation three: Teachers did not understand the process. The third
explanation is that teachers’ low scores on the factual knowledge questions at the
time of the post-test did in fact indicate a lack of understanding about the CB
Process. The scenario questions that teachers performed well on referenced portions
of the CB Process that occur after the initial decision is made to provide further
support to a child who exhibits challenging behaviors. Perhaps because teachers do
not have the factual understanding about which forms to use (question five) or who
the support team members are (question two), they are unsuccessful at accurately
identifying and referring students into the CB Process.
Interviews with the RIT team members and data from the current year’s mid-
year and end-of-year PBS Teacher Survey, which were not within the scope of this
study, can provide more information about whether or not teachers actually
understand the CB Process. It is the responsibility of the RIT to review all BIRs and to conduct a variety of classroom observations, both related and unrelated to PBS. Therefore, the RIT may have insights into teachers' actual understanding of the CB
Process. Likewise, the PBS Teacher Survey contains information about
implementing the entire PBS system and specifically about the CB process that may
also provide evidence to confirm or disconfirm this hypothesis.
Explanation four: Teachers opted out of participating. The fourth
explanation is that teachers did understand the CB Process, but for unknown reasons
chose not to participate in the process. In other words, teachers chose to opt out of
the PBS process and either did not address challenging behaviors or used alternate
means of addressing challenging behaviors in the classroom.
Teacher focus groups can provide more nuanced information that might
indicate teachers’ comfort level with PBS and the CB Process, indications of whether
teachers are opting out, as well as provide opportunities to gain a deeper
understanding of teachers’ experiences with and feelings about the CB Process. It
may also be possible to gather this information by surveying the entire teaching staff,
not just those who implemented the CB Process, at follow-up. A general follow-up
survey can include questions about alternative methods of addressing challenging
behaviors that may indicate whether teachers are opting out of the process.
All four of these explanations are plausible, and all require more information
to reach a conclusion. The next section summarizes these implications and offers
next steps.
132
Suggestions and Next Steps
This study specifically examined the efficacy of the CB Process training in relation to changing teacher attitudes, increasing teachers' knowledge, and supporting the successful implementation of the CB Process in the classroom. There are several
suggestions for strengthening and/or extending the study.
First, a small sample size, especially for the follow-up portion of the study,
hindered the researcher’s ability to uncover statistically significant results and to
arrive at any definitive conclusions. While there were a number of statistically significant findings, a larger sample size would have increased the statistical power of the analyses and perhaps uncovered additional statistically significant results. Based on the high response rate to the pre-test in Regions 4 & 5, where it was presented in person using a paper-and-pencil format, a higher response rate might be obtained by having participants complete paper-and-pencil surveys that are collected
immediately before and after the training. Follow-up surveys could be administered
at a staff meeting in the same manner. The use of participant interviews would have
also provided further insight into the research questions explored in this study.
However, privacy and anonymity have always been a significant staff concern when
responding to surveys and workplace evaluations (personal communication, J. Doe,
2009). Exploring ways of ensuring anonymity in the interview and/or focus group
process will be an important step for future evaluation efforts.
Second, successful implementation of the CB Process was determined using a
self-report question. As noted in the research literature, self-report is often an unreliable measure. Furthermore, as discussed earlier, it was unclear how the teachers surveyed may have defined success. Bolstering the self-report data with interviews or surveys of the RIT members regarding the implementation of the CB Process would strengthen the conclusions regarding effective transfer to practice. As mentioned
earlier, future evaluation efforts will need to explore methods of ensuring
confidentiality and anonymity in the interview process.
Third, continued support and follow-up may be a key factor in changing teacher behaviors and ultimately impacting student outcomes. Teachers indicated a desire for a variety of follow-up supports that included coaching, collaboration, and increased planning time. Those who felt successful at implementing the CB Process reported receiving this type of follow-up support; a few open-ended responses
specifically identified support and follow-up as helpful when implementing the CB
Process. The research also indicates that in order for teachers to experience the
positive outcomes that will lead to increases in efficacy, follow-up training and
support are vital. As noted earlier, it was the continued support and coaching
provided to members of the Follow-up sample that helped the CB Process training to
meet the criteria set forth by Garet et al. (2001). Perhaps the follow-up support and
coaching were the main treatment factors affecting success. Future evaluation
efforts can utilize interviews, focus groups, and more detailed surveys to provide
more clarity about what parts of the training were most helpful. A larger sample that
allows the researcher to disaggregate the sample according to type, frequency, and
quality of support may also help to paint a clearer picture.
134
Fourth, the researcher uncovered surprising information about the referral
rate for children with challenging behaviors. There are a number of explanations for
this low rate. The most positive explanation is that the PBS system is working and
challenging behaviors have decreased. The low rate may also be an artifact of random variation in the year-to-year occurrence of challenging behaviors. However, if the low reporting rate is due to a lack of understanding of the CB Process or, even worse, a result of teachers choosing to opt out of participating in the PBS system at all, the implications are quite serious. Further study of this phenomenon is warranted
to ensure that all students who need additional support for challenging behaviors
receive the support to which they are entitled. Future evaluation efforts should
include a variety of data sources that can provide information about this
phenomenon. Historical data on referral rates and Teacher PBS Survey data that are
already collected by the CBECE division can inform the conclusions. Follow-up
surveys of all teachers should be conducted and should include questions about a
variety of methods that teachers are using to address challenging behaviors. Finally,
teacher focus groups, RIT interviews, and survey data can provide another data source about whether and how the CB Process is being implemented.
Finally, as noted in Chapter One, the most important level of evaluation in the Kirkpatrick framework is level four, student outcomes. Improved student outcomes are the gold standard when determining whether a professional development activity has delivered on its investments. The current study was unable to address changes in student outcomes that were related to teachers' participation in the CB Process Training. The time constraints on the study did not allow for collection and analysis of these data. The CBECE division currently collects a wide array of student data throughout the year; these data are available at the end of each school year. In order to strengthen the conclusions about the efficacy of the CB Process training, the available student outcome data should be examined at the end of the school year and correlated with the data collected in this study to provide a complete evaluation of the efficacy of the CB Process training.
Conclusions
This study further contributed to what we already know about the evaluation
of professional development activities. The findings of the current study were
supported by the existing literature and revealed a number of important issues that
require further investigation.
The most important conclusion that can be drawn from this investigation is
that comprehensive evaluation of professional development efforts is important. As
mentioned earlier, CBECE invests a great deal of time, effort, and resources towards
both professional development and data collection. However, there has not been a
great deal of attention paid toward putting the pieces of the puzzle together. As
demonstrated in this study, linking data collection to the evaluation of professional
development efforts can: (1) provide positive feedback and evidence about CBECE’s
professional development efforts; (2) spotlight possible areas of concern; and (3)
uncover trends and phenomena that may have been previously overlooked.
136
Second, as indicated by the research literature about the theory of Andragogy,
it is important that professional development activities are designed with the needs of
the adult learner in mind and then adjusted to meet the particular needs of the
participants. Doing so helps to ensure the quality of the activity and can help to
better interpret the results of any evaluations that are conducted. For example,
andragogical and effectiveness analyses of the CB Process training conducted in this
study indicated that the training was designed in ways to best meet the needs of the
teachers who participated. So, when evaluating the efficacy of the CB Process
training using the Kirkpatrick framework, the researcher could eliminate poor design
and delivery as possible explanations for a lack of change in teacher attitudes,
knowledge, and behaviors.
Third, more data are necessary to reach more definitive conclusions about the
efficacy of the CB Process Training. Increasing the response rate, obtaining a wider
array of data, and extending the length of the study would provide more and better
data with which to complete this analysis. It was also apparent in this study that the addition of interview and/or observation data would have greatly increased the ability of the researcher to draw definitive conclusions. Although the use of such data to draw conclusions about a complex process poses some difficulty, the depth of information gained would appear to outweigh the difficulties. The conclusions about both level two and level three effectiveness could be bolstered, and a level four analysis of student outcomes could be conducted. Furthermore, additional data could also shed light on the low referral rates revealed in this study.
137
Overall, the purpose of this study was accomplished. The study revealed that the CB Process training was well planned. Furthermore, teachers reacted well to the training, demonstrated improvements in overall knowledge about the CB Process, and performed well on questions that asked them to apply what they knew to real-world scenarios. The results also offered limited evidence that participation in the CB Process was related to the successful implementation of the training objectives. The study also demonstrated the utility of undertaking comprehensive evaluations of professional development efforts. KS CBECE invests a great deal in its teachers and students; this study put the pieces of the puzzle together to show where those investments were successful, where there is room for growth, and what needs further investigation. Results of this study will be shared with the CBECE division and the
regions that participated. Hopefully these positive findings will: (1) encourage
teachers to continue in their efforts to implement the PBS system, (2) encourage the
CBECE administrators to investigate these findings further, and (3) encourage the
continued use of comprehensive evaluations of professional development to ensure
that we are investing in activities that will result in positive outcomes for our
teachers and students.
138
REFERENCES
About Kamehameha Schools. Retrieved December 1, 2009, from
http://www.ksbe.edu/about/facts.php
Allinder, R. M. (1995). An examination of the relationship between teacher efficacy
and curriculum-based measurement and student achievement. Remedial and
Special Education, 16(4), 247.
Americans with Disabilities Act of 1990, As Amended., Pub L. No. 110-325, (2009).
Armor, D. J. (1976). Analysis of the school preferred reading program in selected
Los Angeles minority schools. Santa Monica, CA: Rand Corporation.
Ball, D., & Cohen, D. (1999). Developing practice, developing practitioners: Toward
a practice-based theory of professional education. Teaching as the learning
profession: Handbook of policy and practice, 1, 3–22.
Bandura, A. (1977). Self-efficacy: toward a unifying theory of behavioral change.
Psychological review, 84(2), 191-215.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive
theory. Englewood Cliffs, NJ Prentice-Hall.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York, NY: W. H.
Freeman and Company.
Barnett, W. S. (2003). Better teachers, better preschools: Student achievement linked
to teacher qualifications. Preschool Policy Matters, 2, 1-12.
Barnett, W. S. (2008). Preschool education and its lasting effects: Research and
policy implications. Boulder and Tempe, CO: Education and the Public
Interest Center & Education Policy Research Unit.
Berman, P., & McLaughlin, M. W. (1977). Federal programs supporting
educational change, Vol. VII: Factors affecting implementation and
continuation (Vol. R-1589/7-HEW). Santa Monica, CA: Rand Corporation.
Blank, R., De las Alas, N., & Smith, C. (2008). Does teacher professional
development have effects on teaching and learning?: Analysis of evaluation
findings from programs for mathematics and science teachers in 14 states.
Washington, DC: Council of Chief State School Officers.
139
Borko, H. (2004). Professional development and teacher learning: Mapping the
terrain. Educational Researcher, 33(8), 3.
Bowman, B. T., Donovan, S., & Burns, M. S. (Eds.). (2001). Eager to learn :
educating our preschoolers. Washington, DC: National Academy Press.
Brookfield, S. (1986). Understanding and Facilitating Adult Learning. San
Francisco, CA: Jossey -Bass.
Buysse, V., Winton, P., & Rous, B. (2009). Reaching consensus on a definition of
professional development for the early childhood field. Topics in Early
Childhood Special Education, 28(4), 235.
Caprara, G. V., Barbaranelli, C., Steca, P., & Malone, P. S. (2006). Teachers' self-
efficacy beliefs as determinants of job satisfaction and students' academic
achievement: A study at the school level. Journal of School Psychology,
44(6), 473-490.
Clotfelter, C. T., Ladd, H., & Vigdor, J. L. (2007). How and why do teacher
credentials matter for student achievement? (Working Paper No. 12828).
Retrieved from National Bureau of Economic Research website:
http://www.nber.org/papers/w12828.
Copple, C., & Bredekamp, S. (Eds.). (2009). Developmentally appropriate practice
in early childhood education programs (3rd ed.). Washington, DC: NAEYC.
Creswell, J. W. (2003). Research design (2nd ed.). Thousand Oaks, CA: Sage
Publications.
Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson, W. E. (2003).
Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie
(Eds.), Handbook of mixed methods in social and behavioral research (pp.
209-240). Thousand Oaks, CA: Sage Publications.
Darling-Hammond, L. (1999). Teacher quality and student achievement: A review of
state policy evidence: Center for the Study of Teaching and Policy,
University of Washington Seattle, WA.
DeForest, P. A., & Hughes, J. N. (1992). The effect of teacher involvement and
teacher self-efficacy on ratings of consultant effectiveness and intervention
acceptability. Journal of Educational and Psychological Consultation, 3(4),
301-316.
140
Dembo, M. H., & Gibson, S. (1985). Teachers' sense of efficacy: An important factor
in school improvement. The Elementary School Journal, 86(2), 173-184.
Desimone, L. (2002). How can comprehensive school reform models be successfully
implemented? Review of educational research, 72(3), 433.
Desimone, L., Porter, A., Garet, M., Yoon, K., & Birman, B. (2002). Effects of
professional development on teachers' instruction: Results from a three-year
longitudinal study. Educational evaluation and policy analysis, 24(2), 81-
112.
Dukes, C., Rosenberg, H., & Brady, M. (2008). Effects of Training in Functional
Behavior Assessment. International Journal of Special Education, 23(1), 11.
Early, D., Bryant, D., Pianta, R., Clifford, R., Burchinal, M., Ritchie, S., et al.
(2006). Are teachers’ education, major, and credentials related to classroom
quality and children's academic gains in pre-kindergarten? Early Childhood
Research Quarterly, 21(2), 174-195.
Fox, L., Dunlap, G., Hemmeter, M., Joseph, G., & Strain, P. (2003). The Teaching
Pyramid: A Model for Supporting Social Competence and Preventing
Challenging Behavior in Young Children. Young Children, 58(4), 48-52.
Garet, M., Birman, B., Porter, A., Desimone, L., & Herman, R. (1999). Designing
Effective Professional Development: Lessons from the Eisenhower Program
[and] Technical Appendices.
Garet, M., Porter, A., Desimone, L., Birman, B., & Yoon, K. (2001). What makes
professional development effective? Results from a national sample of
teachers. American Educational Research Journal, 38(4), 915-945.
Ghaith, G., & Yaghi, H. (1997). Relationships among experience, teacher efficacy,
and attitudes toward the implementation of instructional innovation. Teaching
and teacher education, 13(4), 451-458.
Gibson, S., & Dembo, M. H. (1984). Teacher efficacy: A construct validation.
Journal of educational psychology, 76(4), 569-582.
Good Beginnings Alliance. (2007). Fact Sheet: Children, Families and Early
Childhood Education in Hawaii (pp. 4). Honolulu, HI: Good Beginnings
Alliance.
141
Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual
framework for mixed-method evaluation designs. Educational evaluation and
policy analysis, 11(3), 255.
Guskey, T. R. (1988). Teacher efficacy, self-concept, and attitudes toward the
implementation of instructional innovation. Teaching and teacher education,
4(1), 63-69.
Guskey, T. R. (1994). Professional Development in Education: In Search of the
Optimal Mix. Paper presented at the AERA.
Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA:
Corwin Press.
Guskey, T. R. (2003). Analyzing lists of the characteristics of effective professional
development to promote visionary leadership. NASSP Bulletin, 87(637), 4.
Guskey, T. R. (2009). Closing the Knowledge Gap on Effective Professional
Development. Educational Horizons, 224-233.
Hawaii Good Beginnings, Interdepartmental Council, & School Readiness Task
Force. (2006). Hawai'i Preschool Content Standards. Retrieved from
http://www.goodbeginnings.org/images/uploads/Preschool_Standards_2006.p
df.
Hill, H. (2007). Learning in the teaching workforce. The Future of Children, 17(1),
111-127.
Hill, H. (2009). Fixing teacher professional development. Phi Delta Kappan, 90(7),
470-476.
Hogan, M. (2003). New Freedom Commission Report: the President's New Freedom
Commission: recommendations to transform mental health care in America.
Psychiatric Services, 54(11), 1467.
Ingalls, J. D., & Arceri, J. M. (1972). A Trainers Guide To Andragogy, Its Concepts,
Experience and Application.
Iwata, B. A., Wallace, M. D., Kahng, S. W., Lindberg, J., Roscoe, E., Conners, J., et
al. (2000). Skill acquisition in the implementation of functional analysis
methodology. Journal of Applied Behavior Analysis, 33(2), 181.
142
Jordan, H., Mendro, R., & Weerasinghe, D. (1997). Teacher effects on longitudinal student achievement.
Kemmis, S., & Wilkinson, M. (1998). Participatory action research and the study of practice. Action research in practice: Partnerships for social justice in education, 21-36.
Kamehameha Schools. (2008). Kamehameha Schools Annual Report: July 1, 2007 -
June 30, 2008. Honolulu, HI: Kamehameha Schools.
Kamehameha Schools. (2009a). Positive Behavior Support (PBS): Staff Survey.
Planning and Accountability Branch. Kamehameha Schools. Honolulu, HI.
Kamehameha Schools. (2009b). Positive Behavior Support Initiative: 2008 - 2009
Report. Community-Based Early Childhood Education. Kamehameha
Schools. Honolulu, HI.
Kamehameha Schools. (2009c). Student and Parent Handbook. Community Based
Early Childhood Education Division. Kamehameha Schools. Honolulu, HI.
Kane, T. J., Rockoff, J. E., & Staiger, D. O. (2006). What does certification tell us
about teacher effectiveness? Evidence from New York City: National Bureau
of Economic Research.
Kincaid, D., Peshak George, H., & Childs, K. (2006). Review of the Positive
Behavior Support Training Curriculum. Journal of Positive Behavior
Interventions, 8(3), 183.
Kirkpatrick, D. (2001). The four-level evaluation process. In L. L. Ukens (Ed.), What
Smart Trainers Know: The Secrets of Success from the World’s Foremost
Experts (pp. 122-132). San Francisco, CA: Jossey-Bass/Pfeiffer.
Kirkpatrick, D. (2006). Seven keys to unlock the four levels of evaluation.
Performance Improvement, 45(7), 5-8.
Kirkpatrick, D. L., & Kirkpatrick, J. D. (2007). Implementing the four levels: A
practical guide for effective evaluation of training programs. San Francisco,
CA: Berrett-Koehler.
Knowles, M. S. (1978). Andragogy: Adult learning theory in perspective.
Community College Review, 5(3), 9-20.
Knowles, M. S., Holton, E. F., & Swanson, R. A. (1998). The Adult Learner: The
Definitive Classic in Adult Education and Human Resource Development.
(5th ed.). Woburn, MA: Butterworth-Heinemann.
143
Knowles, M. S., Holton, E. F., & Swanson, R. A. (2005). The adult learner: The
definitive classic in adult education and human resource development.
Kutner, M., Sherman, R., Tibbets, J., & Condelli, L. (1997). Evaluating professional
development: A framework for adult education. Washington, DC: Pelavin
Research Institute.
Lewis, L., Parsad, B., Carey, N., Bartfai, N., Farris, E., Smerdon, B., et al. (1999).
Teacher quality: A report on the preparation and qualifications of public
school teachers (NCES 1999-080). US Department of Education.
Washington, DC: National Center for Education Statistics.
Lincoln, Y. S., & Guba, E. G. (2000). Paradigmatic controversies, contradictions,
and emerging confluences. Handbook of qualitative research, 2, 163-188.
Mayer, G. R., Crews, S. D., Cook, C. R., Gale, B., Kraemer, B. R., & Wright, D. B.
(2007). A preliminary study on the effects of training using behavior support
plan quality evaluation guide (BSP-QE) to improve positive behavioral
support plans. Education and Treatment of Children, 30(3), 89-106.
Meijer, C. J. W., & Foster, S. F. (1988). The effect of teacher self-efficacy on
referral chance. The Journal of Special Education, 22(3), 378.
Merriam, S. B., Caffarella, R. S., & Baumgartner, L. M. (2007). Learning in
adulthood: A comprehensive guide (3rd ed.). San Francisco, CA: Jossey-
Bass.
Mertens, D. M. (2003). Mixed methods and the politics of human research: The
transformative-emancipatory perspective. Handbook of Mixed Methods in
Social and Behavioral Research, 135–164.
Moore, J. W., Edwards, R. P., Sterling-Turner, H. E., Riley, J., DuBard, M., &
McGeorge, A. (2002). Teacher acquisition of functional analysis
methodology. Journal of Applied Behavior Analysis, 35(1), 73.
NAEYC. (2003). Early childhood curriculum, assessment, and program evaluation:
Building a effective, accountable system for children birth through age 8,
from http://208.118.177.216/about/positions/pdf/CAPEexpand.pdf
No Child Left Behind Act of 2001, Pub L. No. 107-110 C.F.R. (2001).
144
Parsad, B., Lewis, L., Farris, E., & Greene, B. (2001). Teacher preparation and
professional development. National Center for Education Statistics. Washington, DC (NCES 2001-088).
Patton, M. Q. (2002). Qualitative research and evaluation methods. Thousand Oaks,
CA: Sage Publications.
Penuel, W., Fishman, B., Yamaguchi, R., & Gallagher, L. (2007). What makes
professional development effective? Strategies that foster curriculum
implementation. American Educational Research Journal, 44(4), 921.
Podell, D. M., & Soodak, L. C. (1993). Teacher efficacy and bias in special
education referrals. The journal of educational research, 86(4), 247-253.
Qi, H., & Kaiser, A. (2003). Behavior problems of preschool children from low-
income families: Review of the literature. Topics for Early Childhood Special
Education, 23(4), 188.
Raver, C. (2002). Emotions matter: Making the case for the role of young children's
emotional development for early school readiness. Working Papers 0206.
Harris School of Public Policy Studies, University of Chicago. Chicago, IL.
Retrieved from http://harrisschool.uchicago.edu/about/publications/working-
papers/pdf/wp_02_06.pdf
Raver, C., & Knitzer, J. (2002). Ready to enter: What research tells policymakers
about strategies to promote social and emotional school readiness among
three-and four-year-olds, (Working Paper 0205). Harris School of Public
Policy Studies, University of Chicago. Chicago, IL. Retrieved from
http://harrisschool.uchicago.edu/about/publications/working-
papers/pdf/wp_02_05.pdf
Richter, D., Kunter, M., Klusmann, U., Lüdtke, O., & Baumert, J. (2010).
Professional development across the teaching career: Teachers' uptake of
formal and informal learning opportunities. Teaching and teacher education.
Ross, J., & Bruce, C. (2007). Professional development effects on teacher efficacy:
Results of randomized field trial. The journal of educational research,
101(1), 50-60.
Sanders, W. L., & Horn, S. P. (1998). Research findings from the Tennessee Value-
Added Assessment System (TVAAS) database: Implications for educational
evaluation and research. Journal of Personnel Evaluation in Education,
12(3), 247-256.
145
Scott, T. M., Nelson, C. M., & Zabala, J. (2003). Functional Behavior Assessment
Training in Public Schools. Journal of Positive Behavior Interventions, 5(4),
216.
Skaalvik, E. M., & Skaalvik, S. (2007). Dimensions of teacher self-efficacy and
relations with strain factors, perceived collective teacher efficacy, and teacher
burnout. Journal of educational psychology, 99(3), 611.
Snider, M., & Fu, V. (1990). The effects of specialized education and job experience
on early childhood teachers’ knowledge of Developmentally Appropriate
Practice. Early Childhood Research Quarterly, 5(1), 69-78.
Spillane, J. (1999). External reform initiatives and teachers efforts to reconstruct
their practice: the mediating role of teachers zones of enactment. Journal of
Curriculum Studies, 31(2), 143-175.
Stormont, M., Lewis, T., Beckner, R., & Johnson, N.W. (2008). Implementing
positive behavior support systems in early childhood and elementary settings.
Thousand Oaks, CA: Corwin Press.
Tashakkori, A., & Teddlie, C. (1998) Mixed methodology: Combining qualitative
and quantitative approaches. Vol. 46. Applied Social Research Methods
Series. Thousand Oaks, CA: Sage Publishing.
Tashakkori, A., & Teddlie, C. (2003). Handbook of mixed methods in social &
behavioral research. Thousand Oaks, CA: Sage Publications.
Technical Assistance Center on Social Emotional Intervention for Young Children
(TACSEI). (2004). Facts about young children with challenging behaviors,
from
http://www.challengingbehavior.org/do/resources/documents/facts_about_sh
eet.pdf
The Teaching Commission. (2004). Teaching at Risk: A Call to Action. New York,
NY: The Teaching Commission, The CUNY Graduate Center.
Tschannen-Moran, M., & Woolfolk Hoy, A. (2001). Teacher efficacy: Capturing an
elusive construct. Teaching and teacher education, 17(7), 783-805.
Tschannen-Moran, M., Woolfolk Hoy, A., & Hoy, W. K. (1998). Teacher efficacy:
Its meaning and measure. Review of educational research, 68(2), 202.
146
Villegas-Reimers, E. (2003). Teacher professional development: an international
review of the literature: UNESCO: International Institute for Educational
Planning.
Whitehurst, G. J. (2002). Scientifically based research on teacher quality: Research
on teacher preparation and professional development. Paper presented at the
White House Conference on Preparing Tomorrow’s Teachers.
Wilson, S., & Berne, J. (1999). Teacher learning and the acquisition of professional
knowledge: An examination of research on contemporary professional
development. Review of research in education, 24, 173-209.
Winton, P., & McCollum, J. (2008). Preparing and supporting high quality early
childhood practitioners: Issues and evidence. Practical approaches to early
childhood professional development: Evidence, strategies, and resources, 1–
12.
Wohlstetter, P., Datnow, A., & Park, V. (2008). Creating a system for data-driven
decision-making: applying the principal-agent framework. School
Effectiveness and School Improvement, 19(3), 239-259.
Woolfolk, A. E., & Hoy, W. K. (1990). Prospective teachers' sense of efficacy and
beliefs about control. Journal of educational psychology, 82(1), 81.
Yoon, K., Duncan, T., Lee, S., Scarloss, B., & Shapley, K. (2007). Reviewing the
evidence on how teacher professional development affects student
achievement (Issues & Answers Report, REL 2007–No. 033). Washington,
DC: US Department of Education, Institute of Education Sciences. National
Center for Education Evaluation and Regional Assistance, Regional
Educational Laboratory Southwest. Retrieved from
http://ies.ed.gov/ncee/edlabs, 10(9).
Zins, J., Bloodworth, M., Weissberg, R., & Walberg, H. (2004). The scientific base
linking social and emotional learning to school success. Building academic
success on social and emotional learning: What does the research say, 3–22.
147
APPENDIX A
ANDRAGOGICAL LEARNER ANALYSIS
Column headings: Andragogical Principle | Applies? | Expected Influence of
Individual and Situational Differences (Subject Matter, Individual Learner,
Situational) | Goals and Purposes for Learning (Individual, Institutional,
Societal)

Andragogical Principles (row entries):
1. Adults need to know why they need to learn something
2. The self-concept of adults is heavily dependent upon a move toward
self-direction
3. Prior experiences of the learner provide a rich resource for learning
4. Adults typically become ready to learn when they experience a need to cope
with a life situation or perform a task
5. Adults' orientation to learning is life / problem centered
6. The motivation for adult learners is internal rather than external.
148
APPENDIX B
INTERVIEW PROTOCOL
1. Question: What is the topic of the professional development/training?
2. Question: Why was this particular training needed or developed?
3. Question: Who developed the training?
a. Is that person(s) facilitating the training?
4. Question: How was the training “rolled-out”?
a. Who received the training?
b. Was it mandatory?
5. Question: Is this training an original training or revised?
a. If revised, where did this training come from?
6. Question: How is the professional development structured?
7. Question: Are participants able to practice with each other at the training?
8. Question: Are the trainers facilitating the practice?
9. Question: Did you base this training on any adult learning theories?
a. If so, which theory?
10. Question: How do schools know about this training?
11. Question: What demographic is this training geared towards?
12. Question: Is follow-up after the training built into the professional development?
13. Question: Who determines the type of follow-up or whether to provide follow-up?
14. Question: Who provides follow-up?
15. Question: What kind of results do you expect?
16. Question: How will you know that teachers have learned the information that
you want them to learn?
149
APPENDIX C
CHALLENGING BEHAVIOR PROCESS TRAINING POWER POINT
Facilitator Notes:
Please make sure to have a manual present for facilitators and several copies for
review for participants. Also have blank BIRs available.
Everyone should have a copy of the flow chart.
150
Purpose of Training
To familiarize staff with the challenging behavior process:
- documentation
- support planning
- staff roles
- purpose of meetings
- time frames
151
Definition of Challenging Behaviors that are Minor and Ongoing
• Any repeated pattern of behavior that interferes with learning or engagement
in pro-social interactions with peers and adults.
• Behaviors that are not responsive to the use of developmentally appropriate
guidance procedures are considered ongoing and minor.
Examples of Challenging Behaviors that are ongoing and minor:
• tantrums
• physical and verbal aggression (NOTE: no other child or adult has been
injured)
• disruptive vocal and motor behavior (e.g., screaming, stereotypy)
• running away from adults, class, or school
• noncompliance
• withdrawal
152
Rationale:
All challenging behaviors need to be documented using a Behavior Incident Report
(BIR). BIRs provide documentation of a particular child’s challenging behavior,
including the teacher’s initial efforts to support the child and how families are
informed. In particular, BIRs specifically ask for:
• Description of the challenging behavior
• Possible reasons for the behavior
• When the incident has occurred
• Where the incident has occurred
• How the staff/adult responded to the behavior
• Record of parent contact/discussion about the behavior
• A way to request additional support
BIRs also provide data that can be used to inform and improve instructional practices
and program decisions at the classroom, site, regional, and division-wide levels.
153
Behavior Incident Report (BIR) (see Appendix A): a form for recording
challenging behaviors. Challenging behaviors can be defined as:
• Any repeated pattern of behavior that interferes with learning or engagement
in pro-social interactions with peers and adults.
• Any behavior that is a serious safety concern or is extremely disruptive to the
entire class
• Behaviors that are not responsive to the use of developmentally appropriate
guidance procedures.
154
Weekly Regional Implementation Team Meeting Purpose:
• Review BIRs
• Review DIAL-3 Supplemental Information Sheets
• Track all Intervention/Individualized Support Plans
• Coordinate meetings/family notifications
NOTE: The teacher will hear back from the EC or other members of the RIT regarding the
BIR and/or next steps.
155
Purpose:
To determine if behaviors are stemming from classroom practices or specific child
needs.
If the results indicate:
• classroom practices, the IS will work with the teaching team
• child-specific needs, the RIT will work with the teaching team
156
Intervention Plan Meeting Purpose:
• Introduce family to PBS process
• Introduce family to implementation team
• Gather information about the child
• Discuss challenging behavior
• Brainstorm possible solutions
• Arrange logistics of follow up/review period
Family Meeting Procedures Overview:
• Introductions
• Meeting guidelines w/roles
• Child Strengths
• Challenging Behaviors
• Brainstorming possible reasons for behaviors
• Brainstorming possible solutions for reducing/eliminating behaviors
• Determining Roles and Responsibilities
• Follow-up
157
Filling out the Intervention Plan Form
Child Strengths
• All parties describe what they know about the child
Describing Challenging Behaviors
• From teachers
• From Optional Child Observation
• From families
Hypothesize the reason behind the behavior
Agree on what behaviors to concentrate on
Brainstorm possible solutions to reduce/eliminate behavior
• Determine which solutions are possible for home and classroom
Determining Actions
• Decide what materials/resources, if any, will need to be bought or made
• Decide what other actions, if any, need to be taken
Determining Roles & Responsibilities
• Decide who will be supporting what strategies and how often
Review the Plan
• Make sure everyone on the team understands the intervention plan and their roles
Acknowledgements
• Make sure everyone has an opportunity to ask questions and get clarifications as
needed.
158
Implementing and Monitoring the Plan
Plan Implementation
• Once the plan has been designed, a written copy should be disseminated to all team
members.
Fidelity Checklist
• Use the Fidelity Checklist form (appendix B)
• Followed up on by the EC with scheduled dates for check-ins
• Reviewed at RIT meeting
Monitoring Outcomes
• Plan for an initial observation at the beginning of implementation to ensure that staff
can implement plan components with fidelity.
• Schedule regular dates for check-ins
• Review the plan every two weeks
• Check for Plan Fidelity
• The EC will monitor the implementation of the plan using the fidelity checklist (written
based on the action plan), which is reviewed at weekly RIT meetings.
• Teachers need to continue to fill out BIRs, as they will be used to determine
improvement.
159
If, after two weeks, the challenging behavior begins to decrease but is not completely
extinguished, continue the plan for another three weeks or until the behavior is completely
extinguished.
If challenging behavior continues
First,
• Review plan and make sure it is being implemented as planned.
• Review evaluation data to determine if the pattern is an extinction burst (worse
before it gets better).
• Examine events to see if there are new triggers for behavior.
Then,
• Restore support plan and implement with fidelity; or
• Continue plan through extinction burst; or
• Add components to plan to address new triggers; or
• Move into Individualized Support Planning
• Conduct functional behavior assessment
If Challenging behavior increases or stays the same
• Do the same as above for a total of two cycles (4 weeks)
• If after 2 cycles the behavior increases or stays the same, move to an Individual
Support Plan and begin functional behavior assessment immediately.
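Facilitator aside (added for illustration; not part of the original training materials): the
review steps above amount to a small decision rule. The sketch below restates that rule in
Python purely as a summary; the function name, argument names, and returned phrases are
hypothetical labels for the options described on this slide.

def review_intervention_plan(behavior_trend, implemented_with_fidelity,
                             extinction_burst, new_triggers, cycles_completed):
    """Illustrative restatement of the two-week Intervention Plan review rule.

    behavior_trend: "extinguished", "decreasing", or "same_or_increasing"
    cycles_completed: number of two-week review cycles completed so far
    """
    if behavior_trend == "extinguished":
        return "End the plan; continue routine BIR documentation"
    if behavior_trend == "decreasing":
        # Improving but not gone: stay the course for another three weeks.
        return "Continue the plan until the behavior is completely extinguished"
    # Behavior unchanged or worse: check implementation and data first.
    if not implemented_with_fidelity:
        return "Restore the support plan and implement it with fidelity"
    if extinction_burst:
        return "Continue the plan through the extinction burst"
    if new_triggers:
        return "Add components to the plan to address the new triggers"
    if cycles_completed < 2:
        return "Repeat the review for another two-week cycle"
    # After two full cycles (4 weeks) with no improvement, escalate.
    return ("Move into Individualized Support Planning and begin a "
            "functional behavior assessment immediately")

For example, review_intervention_plan("same_or_increasing", True, False, False, 2)
returns the escalation branch, matching the four-week rule stated above.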
160
Definition of Challenging Behaviors that are serious safety concerns and extreme
• Any repeated pattern of behavior that interferes with learning or engagement in pro-
social interactions with peers and adults.
• Any behavior that is a serious safety concern or is extremely disruptive to the entire class requires immediate support.
• Consider frequency, intensity and duration.
Examples of Challenging Behaviors that are serious safety concerns:
• A severe and extremely prolonged tantrum (over 30 min.)
• Intentional physical aggression in which another child is injured, including biting,
bruising another person, small animals, etc.
• Climbing on high shelves
• Intentional property destruction
• Self-injury
• Inappropriate sexual behavior (must contact OC/EC immediately)
161
Immediate Support Guidelines Purpose:
• To have an emergency plan in place in case a challenging behavior poses a
health or safety risk.
There are three types of Immediate Support Guidelines based on the site facility.
• Site with Support Staff (appendix C)
• Single Classroom with No Support Staff Available (appendix C)
• Two Classrooms with No Support Available (appendix C)
Each site needs to have a plan that addresses:
• Who to call in case of a child with dangerous or overly disruptive behaviors.
• There should be several people listed in the order in which they should be called
(first, second, etc.), in case the first person on the list cannot be reached.
• Guidelines for what activity to do with the rest of the class and/or what
location to take the class to while getting the situation under control and still
maintaining the teacher-to-student ratio.
Plans should be reviewed with staff at each site yearly and changed if necessary.
162
When should BIRs be filled out for behaviors that are serious and safety
concerns?
• If the challenging behavior is a serious safety concern or is extremely
disruptive to the entire class, a BIR needs to be filled out immediately after
the first incident, regardless of the time of year.
Examples of Challenging Behaviors that are serious safety concerns:
• Physical aggression in which another child is injured, including biting,
bruising another person, etc.
• A severe and extremely prolonged tantrum (over 30 min.)
• Behaviors that jeopardize the personal safety of self and others (e.g., climbing on
high shelves, running out of the classroom or school)
• Property destruction
• Self-injury
• Inappropriate sexual behavior (must contact OC/EC immediately)
163
Immediate Regional Implementation Team Meeting Purpose:
Review BIRs for:
• Classroom trends
• Teacher Support Needs
• Family meeting notifications
• Coordination of Functional Behavior Assessment Process
164
Functional Behavior Assessment Process Purpose:
• Collect Data on Challenging Behavior by doing a Functional Behavior Assessment
• A process for developing an understanding of a person’s challenging behavior and,
in particular, how the behavior is governed by environmental events.
• Results in the identification of the “purpose” or “function” of the challenging
behavior.
• Develop Individualized Support Plan Draft
• Focused Observations
• Observe the child in target routines and settings.
• Collect data on child behavior, looking for situations that predict challenging
behavior and that are linked with appropriate behavior.
Everybody Helps
• Family collects data
• Educational staff collects data
• Outreach Counselors collect data
• Collect data in ALL settings
NOTE: We will do our best to include families in the process; if that is not possible, the
process must still continue. The EC will let the parents know that we will continue to work
towards putting a support plan in place for the child at school.
165
Functional Behavior Assessment Process Forms
1. Classroom Observation Form
2. Home Observation Form
3. Interview Form for both families and teaching staff
4. Draft Planning Form for the Individualized Support Plan
166
Purpose of the Individualized Support Plan:
• Review Classroom/ Home Focused Observations
• Review Classroom/ Home Interviews
• Review and finalize Hypothesis and Individual Support Plan
• Develop an Action Plan
• Review fidelity of implementation plan
Family Meeting Procedures Overview:
• Introductions
• Meeting guidelines w/roles
• Results of Staff interviews
• Results of Family interview
• Review initial draft of Individualized Support plan
• Review/Brainstorming possible reasons for behaviors
• Review/Brainstorming possible solutions for reducing/eliminating behaviors
• Finalizing the Individualized Support plan
• Developing the Action plan
• Determining Roles and Responsibilities
• Setting dates for Follow –up
NOTE: We will do our best to include families in the process; if that is not possible, the process must
still continue. The EC will let the parents know that we will continue to work towards putting a
support plan in place for the child at school.
167
Writing the Final Individualized Support Plan (appendix E):
• Must be linked to the functional assessment interviews and observations
• Must be tied to the function of the behavior
• Must “fit” with the abilities, routines, and values of caregivers
• Must have “buy-in” from the team
• Should include Safety Net Procedure, if necessary (see pages 28-29 for
procedures)
• If the child's behavior poses significant risk to self or others, the individual
support plan should include strategies to ensure safety and rapid de-escalation
of the crisis.
168
Things to consider when developing the action plan (appendix E)
• Develop plan using plain language.
• Develop mini-plans for difficult routines.
• Make sure plan will fit with routines/activities/values of family and teaching
staff.
• Develop action plan of who will produce what components needed to
implement the plan.
• Ask the following questions:
• What materials are needed to implement the plan? Are they readily available
or do they need to be made or purchased? If so, who will be responsible for
doing so, and by when does it need to be done?
• When will each component of the plan be implemented, and who will be
responsible for initiating the use of each component (e.g. prevention
strategies, programs to teach replacement skills, response strategies, etc.)?
• Design components that are easy to use, easy to remember.
• Plan must accommodate competing demands on teaching staff and family.
169
Implementing and Monitoring the Plan
• Once the plan has been designed, a written copy should be disseminated to all team
members.
• Review strategies, demonstrate or guide, provide reinforcement
• Design supports that help the adult remember the plan
• Be cautious about extinction bursts – Explain to family and staff that often they will see a
behavior get worse before it gets better or that they might see a honeymoon period and then
the behavior returns. Consistency over time will be the key. Offer support and availability as
needed.
• Make sure that staff/family understand that there will be a meeting in two weeks to evaluate
the implementation of support plan.
Monitoring Outcomes
• Plan for an initial observation at the beginning of implementation to ensure that staff can
implement plan components with fidelity.
• In addition, staff/family may benefit from a brief conversation (this could be by phone,
email, etc.) to discuss plan implementation and the child’s response. These conversations are
great opportunities to provide support and encouragement regarding efforts in plan
implementation.
• Schedule regular dates for check-ins
• Review the plan every two weeks
• Check for Plan Fidelity
170
If, after two weeks, the challenging behavior begins to decrease but is not completely
extinguished, inform families and then continue for another two weeks or until the behavior is
completely extinguished.
If Challenging Behavior continues
• First, give teachers the Evaluating the Individualized Support Plan form (appendix E) to help
them:
• Review plan and make sure it is being implemented as planned.
• Review evaluation data to determine if the pattern is an extinction burst (worse before it gets
better).
• Examine events to see if there are new triggers for behavior.
Then,
• Restore support plan and implement with fidelity; or
• Continue plan through extinction burst; or
• Add components to plan to address new triggers; or
• Conduct a new functional assessment and develop new support strategies.
If Challenging behavior increases or stays the same
• Do the same as above for a total of two cycles (4 weeks)
• If after 2 cycles the behavior increases or stays the same, the RIT will meet with the teaching
staff and determine next steps.
• When all solutions have been exhausted, discussion of alternatives such as disenrollment of the
child from the preschool program or transition to other programs will occur.
• NOTE: The RIT can consult with the Instructional Branch if they would like to.
171
Appealing a Disciplinary Decision:
Disenrollment will be verbally communicated to the parent(s)/legal guardian(s) and then followed up with a letter.
Once the decision has been communicated, the parent(s)/legal guardian(s) have ten (10) working days from the
date of the letter to submit, in writing, an appeal to the Director of Community-Based Early Childhood
Education. Only disciplinary action decisions by regional management that result in disenrollment from the
Program may be appealed to the Director of CBECE.
All other disciplinary decisions by regional management that result in consequences other than release from the
program may not be appealed.
Appeals must be based on one or more of the following specific factors:
• The presentation of new information regarding the situation that was not previously available during
the original investigation of the inappropriate behavior.
• A potential lapse or error in applying the school’s procedures applicable to the situation based upon the
school’s disciplinary process.
Once the written appeal has been received, the Director of CBECE will review the information gathered by the
regional management.
• If there are insufficient grounds for the appeal, the Director will notify the parent(s)/legal guardian(s) in
writing.
• If the Director determines there are sufficient grounds for the appeal, time will be scheduled for the
Director to meet with the parent(s)/legal guardian(s) and, if deemed necessary by the Director, with the
child and/or regional management. After the scheduled meeting, the Director will render a final
decision of the appeal in writing to the family and regional management as soon as practical. The
decision by the Director is final and not reviewable.
172
Definition:
A resource tool to help teams determine which route to take: a standardized checklist
for the regional implementation teams with criteria for determining what supports individual
children/teachers/families will need.
Use the individualized support plan for children with:
• 3 BIRs with Health & Safety Concerns
• Combination of 3 BIRs with Minor and Ongoing concerns and Health and Safety
concerns including:
• 2 Minor and Ongoing BIRs and 1 Health and Safety BIR
• 2 Health and Safety BIRs and 1 Minor and Ongoing BIR
Use the intervention plan for children with:
• 3 BIRs with Minor and ongoing concerns
If only developmental skills are of concern, make a referral for Developmental assessment
If only family matters are of concern, develop a family support plan
If there are multiple concerns, use any combination above as appropriate
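Facilitator aside (added for illustration; not part of the original training materials): the
criteria above reduce to a simple count of BIR categories. The sketch below encodes those
thresholds in Python as a summary only; the function and parameter names are hypothetical,
and the combined-concern case is noted just in a comment.

def choose_support_route(minor_ongoing_birs, health_safety_birs,
                         developmental_concern_only=False,
                         family_concern_only=False):
    """Illustrative encoding of the BIR-count criteria listed above."""
    if developmental_concern_only:
        return "Referral for developmental assessment"
    if family_concern_only:
        return "Family support plan"
    total_birs = minor_ongoing_birs + health_safety_birs
    # Individualized Support Plan: 3 Health & Safety BIRs, or any three BIRs
    # that include at least one Health & Safety incident (2+1 or 1+2).
    if health_safety_birs >= 3 or (total_birs >= 3 and health_safety_birs >= 1):
        return "Individualized Support Plan"
    # Intervention Plan: 3 Minor and Ongoing BIRs.
    if minor_ongoing_birs >= 3:
        return "Intervention Plan"
    # Below threshold: keep documenting with BIRs.
    # (Where multiple concern types apply, the routes above may be combined.)
    return "Continue monitoring and documenting with BIRs"

# Example: two Minor and Ongoing BIRs plus one Health and Safety BIR routes to an ISP.
print(choose_support_route(minor_ongoing_birs=2, health_safety_birs=1))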
173
Break up into small groups or partners, read the scenario and determine what the next step
would be and share back.
Makena
• write up a BIR because it is a health and safety issue: another child was
scratched
• teacher will inform both parents about the incident
• include the accident or ouwie report for the injured child
• send the BIR to the EC within 24 hours with the ouwie report
• help Makena by reviewing classroom rules/division-wide expectations
• continue to monitor and write additional BIRs if necessary
Kade
• it would depend on what number incident this is
• if it is the third incident then write a BIR
• review classroom rules and division-wide expectations
• teacher to work on problem solving with children
Have BIRs available for groups to fill out as a part of the exercise
174
Break up into small groups or partners, read the scenario and determine what the
next step would be and share back.
Mary
• being withdrawn is a challenging behavior; if it is consistent and ongoing,
a BIR should be filled out
• talk to the parent after BIR submittal
• possible intervention plan after talking to teaching staff and mom
Kaleo
• being withdrawn is a challenging behavior
• if consistent, write a BIR after the 3rd incident
• ask for additional support
• may go the intervention plan route
• if on an IP and it is not working after 2 cycles (4 weeks), then move to an ISP
175
APPENDIX D
PRE-TEST SURVEY
1. Introduction and Welcome
Research suggests that effective professional development is a key piece in
providing teachers with the tools and strategies for improving outcomes for students.
We are interested in learning more about your experience with the PBS Challenging
Behavior Training. We appreciate hearing your views.
Your responses are anonymous and there is no way for the researchers to link your
identity to these responses. The information below may help us to understand how
teachers' perceptions may vary. Knowing more about your views and experience
can help strengthen training and resources for teachers on this very important
subject.
By completing this survey, you agree to be a participant. You may quit the survey at
any time or skip any question you do not wish to answer. Your participation is
voluntary.
Thank you for participating!
2. Participant ID Number
You will be asked to respond to a number of surveys over the course of the next few
months. A unique participant ID will be used in order to maintain your anonymity
while matching your responses on each survey.
1. In the box below please enter the year of your birth followed by the last four
digits of your phone number. (YYYYPPPP). For example, if your birth year
is 1976 and the last four digits of your phone number are 1234, please enter
19761234.
_____________________________
176
3. Demographic Information
The following questions will ask for demographic information. Information about the
qualifications and experience of teachers can be used to better tailor professional
development opportunities.
1. What is your job title?
• P3 Teacher
• P4 Teacher
• P3 Teaching Assistant
• P4 Teaching Assistant
• Extended Day Teacher
• Extended Day Teaching Assistant
2. What region are you in?
• Honolulu
• Koolau I
• Koolau II
• Waianae I
• Waianae II
3. How many years have you worked in your current job?
Number of Years _______________
4. How many years have you worked in early childhood settings (children from
birth to age 5)?
Years of Experience ________________________
5. What is your highest level of education?
• High School Diploma
• Some college
• CDA
• AA/AS Degree
• 4 Year Degree
• Masters Degree
• Doctorate
177
6. If you have a degree, in what field did you complete the degree?
(For each degree held, indicate the field: ECE, Education, or Other)
AA/AS Degree
4 Year Degree
Masters Degree
Doctorate
178
4. Teacher Beliefs
This questionnaire is designed to help us gain a better understanding of the kinds of
things that create challenges for teachers. Your answers are confidential.
Directions: Please indicate your opinion about each of the questions below by
marking any one of the nine responses in the columns on the right side, ranging from
(1) “None at all” to (9) “A Great Deal” as each represents a degree on the continuum.
Please respond to each of the questions by considering the combination of your
current ability, resources, and opportunity to do each of the following in your present
position
1 - None at all
2
3 - Very Little
4
5 – Some Influence
6
7 – Quite a Bit
8
9 - A Great Deal
1
How much can you do to control disruptive
behavior in the classroom?
2
How much can you do to motivate students
who show low interest in school work?
3
How much can you do to calm a student
who is disruptive or noisy?
4
How much can you do to help your students
value learning?
5
To what extent can you craft good questions
for your students?
6
How much can you do to get children to
follow classroom rules?
7
How much can you do to get students to
believe they can do well in school work?
8
How well can you establish a classroom
management system with each group of
students?
9
To what extent can you use a variety of
assessment strategies?
10
To what extent can you provide an
alternative explanation or example when
students are confused?
11
How much can you assist families in
helping their children do well in school?
12
How well can you implement alternative
teaching strategies in your classroom?
179
5. Challenging Behavior Process
1. What form is used to document the occurrence of challenging behaviors in
the classroom?
• Intervention Plan
• Behavior Incident Report (BIR)
• Individual Support Plan
• Referral Form
2. Who may be on the Regional Implementation Team (RIT) (check all that
apply)
• Educational Coordinator (EC)
• Assistant Educational Coordinator (AEC)
• Instructional Specialist (IS)
• Outreach Counselor (OC)
• Classroom Teachers
• PBS Leadership Team Representative
3. If an EC observes that challenging behaviors stem from classroom practices,
the next step in the CB Process is:
• Instructional Specialist (IS) works with Teaching Team to modify
classroom practices
• Regional Implementation Team (RIT) works with Teaching Team to
modify classroom practices
• An Intervention Plan Meeting is scheduled
• Teacher stops filling out Behavior Incident Reports (BIRs)
4. Which is NOT one of the purposes of the Intervention Plan Meeting?
• To respond to challenging behaviors that are minor and ongoing
• To respond to challenging behaviors that are serious safety concerns and
extreme
• Introduce the family to the PBS process
• To gather information about the child
5. Which of the following is used to address challenging behaviors that are
serious safety concerns and extreme? (check all that apply)
• Immediate Support Guidelines
• Behavioral Incident Report (BIR)
• Immediate Regional Implementation Team (RIT) meeting
• Functional Behavior Assessment (FBA)
• Individualized Support Plan
180
• Intervention Plan
• DIAL – R
6. What is the purpose of the Functional Behavior Assessment?
• To collect data about challenging behaviors
• To understand how the challenging behaviors are governed by
environmental events
• To develop an Individualized Support Plan Draft
• All of the Above
7. Who participates in a Functional Behavior Assessment?
• Teacher
• Teaching Assistant
• Families
• Regional Implementation Team
• All of the Above
8. You have implemented an Intervention Plan for two weeks. The
challenging behavior has continued with no sign of decreasing. The
Intervention Plan was implemented with fidelity, there was no “extinction
burst”, and there are no new triggers. What are your next steps?
• Move to an Individual Support Plan and begin Functional Behavior
Assessment immediately.
• Continue to repeat the Intervention Plan until the behavior is completely
extinguished.
• Repeat the Intervention Plan for a total of two cycles (4 weeks), if after
two cycles the behavior increases or stays the same, move to an
Individual Support Plan and begin Functional Behavior Assessment
immediately.
• Discuss alternatives such as disenrollment of child from the preschool
program or transition to other programs.
9. You have implemented an Individualized Support Plan for two weeks. The
challenging behavior has continued without decreasing. What are your next
steps?
• Move to an Intervention Plan.
• Review plan and make sure it is being implemented as planned; Review
evaluation data to determine if the pattern is an extinction burst; Examine
events to see if there are new triggers for behavior.
181
• The plan is not working. Create a new Individualized Support Plan and
implement for two weeks.
• Discuss alternatives such as disenrollment of child from the preschool
program or transition to other programs.
10. You have implemented an Intervention Plan for two weeks. The
challenging behavior begins to decrease but is not completely extinguished.
What are your next steps?
• Repeat the Intervention Plan for a total of two cycles (4 weeks), if after
two cycles the behavior increases or stays the same, move to an
Individual Support Plan and begin Functional Behavior Assessment
immediately.
• Move to an Individual Support Plan and begin Functional Behavior
Assessment immediately.
• Continue the Individual Support Plan for another 3 weeks or until the
behavior is completely extinguished.
• Stop the plan and continue to document challenging behaviors using the
Behavioral Incident Report (BIR).
7. Thank You
Thank you for taking the time to complete this survey. Your responses will help to
strengthen training and resources for teachers in the future.
182
APPENDIX E
POST-TEST SURVEY
1. Introduction and Welcome
Research suggests that effective professional development is a key piece in
providing teachers with the tools and strategies for improving outcomes for students.
We are interested in learning more about your experience with the PBS Challenging
Behavior Training. We appreciate hearing your views.
Your responses are anonymous and there is no way for the researchers to link your
identity to these responses. The information below may help us to understand how
teachers' perceptions may vary. Knowing more about your views and experience
can help strengthen training and resources for teachers on this very important
subject.
By completing this survey, you agree to be a participant. You may quit the survey at
any time or skip any question you do not wish to answer. Your participation is
voluntary.
Thank you for participating!
2. Participant ID Number
You will be asked to respond to a number of surveys over the course of the next few
months. A unique participant ID will be used in order to maintain your anonymity
while matching your responses on each survey.
1. In the box below please enter the year of your birth followed by the last four
digits of your phone number. (YYYYPPPP). For example, if your birth year
is 1976 and the last four digits of your phone number are 1234, please enter
19761234.
_____________________________
183
3. Teacher Response
The following questions ask you to provide your feedback and opinions about the
Challenging Behavior Process training. Your answers will help to inform future
professional development trainings.
1. To what extent was the professional development activity
Not
at All
To Some
Degree
Quite a
Bit
Completely
Consistent with your own goals for
professional development?
Consistent with your region's /
division's plan to change practice?
Based explicitly on what you had
learned in earlier professional
development experiences?
Designed to support Preschool
licensing and curriculum standards?
Designed to support KS-CBECE
assessments?
Please add any additional comments about how the CB Process training is congruent
with your own goals for professional development and/or KS-CBECE preschool
goals and standards.
________________________________________________________________________
184
Please indicate how much you agree with the following statements:
Strongly
Disagree
Disagree Agree
Strongly
Agree
The ideas and practices taught were
new to me
The ideas and practices taught are
relevant to my professional practice
The quality of information
presented was sound (i.e., reflects
best practices)
The information was conveyed in an
effective manner
I am likely to implement the ideas
and practices taught in my
professional practice
If I implement the ideas and
practices taught, my students will
likely benefit
I feel confident in my ability to
implement the ideas and practices
taught
Please add any additional comments about the effectiveness of the CB Process
training.
____________________________________________________________
2. I know how the research about challenging behaviors is connected to my
instructional practice.
• Yes
• No
185
3. What do you need next to apply what you learned in the classroom?
Please check all that apply.
• More time to learn the Challenging Behavior Process
• Practice under simulated conditions, with feedback
• Coaching, either by trainers or instructional coaches, in the classroom
with feedback
• Opportunities to collaborate/discuss with other teachers
• Additional planning time
• More research based evidence that the Challenging Behavior Process
training is relevant to my teaching and classroom
• Other (please specify)
4. I expect the following support to occur after this training.
Please check all that apply.
• Nothing
• Follow up/Coaching by the trainers
• Follow up/Coaching by the instructional specialist at my school
• Follow up/Coaching by the outreach counselor at my school
• Follow up/Coaching by my peers who also attended the training
• Follow up/Coaching by my administrator
5. When thinking about professional development in general, how important to
you are the following?
Not at
All
A Little
Important
Important
Very
Important
Coaching
Feedback
Peer Mentoring
Administrative Support
Collegial Support
Follow-up training by
trainers
186
4. Teacher Beliefs
This questionnaire is designed to help us gain a better understanding of the kinds of
things that create challenges for teachers. Your answers are confidential.
Directions: Please indicate your opinion about each of the questions below by
marking any one of the nine responses in the columns on the right side, ranging from
(1) “None at all” to (9) “A Great Deal” as each represents a degree on the continuum.
Please respond to each of the questions by considering the combination of your
current ability, resources, and opportunity to do each of the following in your present
position
1- None at all
2
3 - Very Little
4
5 – Some Influence
6
7 – Quite a Bit
8
9 - A Great Deal
1
How much can you do to control disruptive
behavior in the classroom?
2
How much can you do to motivate students
who show low interest in school work?
3
How much can you do to calm a student
who is disruptive or noisy?
4
How much can you do to help your students
value learning?
5
To what extent can you craft good questions
for your students?
6
How much can you do to get children to
follow classroom rules?
7
How much can you do to get students to
believe they can do well in school work?
8
How well can you establish a classroom
management system with each group of
students?
9
To what extent can you use a variety of
assessment strategies?
10
To what extent can you provide an
alternative explanation or example when
students are confused?
11
How much can you assist families in
helping their children do well in school?
12
How well can you implement alternative
teaching strategies in your classroom?
187
5. Challenging Behavior Process
1. What form is used to document the occurrence of challenging behaviors in
the classroom?
• Intervention Plan
• Behavior Incident Report (BIR)
• Individual Support Plan
• Referral Form
2. Who may be on the Regional Implementation Team (RIT) (check all that
apply)
• Educational Coordinator (EC)
• Assistant Educational Coordinator (AEC)
• Instructional Specialist (IS)
• Outreach Counselor (OC)
• Classroom Teachers
• PBS Leadership Team Representative
3. If an EC observes that challenging behaviors stem from classroom practices,
the next step in the CB Process is:
• Instructional Specialist (IS) works with Teaching Team to modify
classroom practices
• Regional Implementation Team (RIT) works with Teaching Team to
modify classroom practices
• An Intervention Plan Meeting is scheduled
• Teacher stops filling out Behavior Incident Reports (BIRs)
4. Which is NOT one of the purposes of the Intervention Plan Meeting?
• To respond to challenging behaviors that are minor and ongoing
• To respond to challenging behaviors that are serious safety concerns and
extreme
• Introduce the family to the PBS process
• To gather information about the child
5. Which of the following is used to address challenging behaviors that are
serious safety concerns and extreme? (check all that apply)
• Immediate Support Guidelines
• Behavioral Incident Report (BIR)
• Immediate Regional Implementation Team (RIT) meeting
• Functional Behavior Assessment (FBA)
• Individualized Support Plan
188
• Intervention Plan
• DIAL – R
6. What is the purpose of the Functional Behavior Assessment?
• To collect data about challenging behaviors
• To understand how the challenging behaviors are governed by
environmental events
• To develop an Individualized Support Plan Draft
• All of the Above
7. Who participates in a Functional Behavior Assessment?
• Teacher
• Teaching Assistant
• Families
• Regional Implementation Team
• All of the Above
8. You have implemented an Intervention Plan for two weeks. The
challenging behavior has continued with no sign of decreasing. The
Intervention Plan was implemented with fidelity, there was no “extinction
burst”, and there are no new triggers. What are your next steps?
• Move to an Individual Support Plan and begin Functional Behavior
Assessment immediately.
• Continue to repeat the Intervention Plan until the behavior is completely
extinguished.
• Repeat the Intervention Plan for a total of two cycles (4 weeks), if after
two cycles the behavior increases or stays the same, move to an
Individual Support Plan and begin Functional Behavior Assessment
immediately.
• Discuss alternatives such as disenrollment of child from the preschool
program or transition to other programs.
9. You have implemented an Individualized Support Plan for two weeks. The
challenging behavior has continued without decreasing. What are your next
steps?
• Move to an Intervention Plan.
• Review plan and make sure it is being implemented as planned; Review
evaluation data to determine if the pattern is an extinction burst; Examine
events to see if there are new triggers for behavior.
189
• The plan is not working. Create a new Individualized Support Plan and
implement for two weeks.
• Discuss alternatives such as disenrollment of child from the preschool
program or transition to other programs.
10. You have implemented an Intervention Plan for two weeks. The
challenging behavior begins to decrease but is not completely extinguished.
What are your next steps?
• Repeat the Intervention Plan for a total of two cycles (4 weeks), if after
two cycles the behavior increases or stays the same, move to an
Individual Support Plan and begin Functional Behavior Assessment
immediately.
• Move to an Individual Support Plan and begin Functional Behavior
Assessment immediately.
• Continue the Individual Support Plan for another 3 weeks or until the
behavior is completely extinguished.
• Stop the plan and continue to document challenging behaviors using the
Behavioral Incident Report (BIR).
7. Thank You
Thank you for taking the time to complete this survey. Your responses will help to
strengthen training and resources for teachers in the future.
190
APPENDIX F
FOLLOW-UP SURVEY
1. Introduction and Welcome
Research suggests that effective professional development is a key piece in
providing teachers with the tools and strategies for improving outcomes for students.
We are interested in learning more about your experience with the PBS Challenging
Behavior Training. We appreciate hearing your views.
Your responses are anonymous and there is no way for the researchers to link your
identity to these responses. The information below may help us to understand how
teachers' perceptions may vary. Knowing more about your views and experience
can help strengthen training and resources for teachers on this very important
subject.
By completing this survey, you agree to be a participant. You may quit the survey at
any time or skip any question you do not wish to answer. Your participation is
voluntary.
Thank you for participating!
2. Participant ID Number
You will be asked to respond to a number of surveys over the course of the next few
months. A unique participant ID will be used in order to maintain your anonymity
while matching your responses on each survey.
1. In the box below please enter the year of your birth followed by the last four
digits of your phone number. (YYYYPPPP). For example, if your birth year
is 1976 and the last four digits of your phone number are 1234, please enter
19761234.
_____________________________
191
3. Introduction
You have recently utilized the Challenging Behavior Process with one of your
students. The following questions ask you to provide your feedback and
opinions about how the Challenging Behavior Process training prepared you for
your experiences. Your answers will help to inform future professional
development trainings.
4. Teacher Response
Now that you have taken the Challenging Behavior Process training and have had
time to implement your new skills and knowledge, please answer the following
questions to the best of your ability.
1. Please indicate how much you agree with the following statements
Strongly
Disagree
Disagree Agree
Strongly
Agree
I have been able to successfully
implement the major objectives taught
in the Challenging Behavior Process
training.
As a result of implementing the
objectives of the Challenging Behavior
Process training, I have observed a
positive impact on my general
education students.
As a result of implementing the
objectives of the Challenging Behavior
Process training, I have observed a
positive impact on my students with
challenging behaviors.
I consider the changes in my teaching
and/or student outcomes as a result of
implementing the objectives of the
Challenging Behavior Process training
important and valuable.
192
Please add any additional comments about your experience implementing what
you learned in the CB Process training.
________________________________________________________________________
2. To what extent was the professional development activity
Not
at All
To Some
Degree
Quite a
Bit
Completely
Consistent with your own goals for
professional development?
Consistent with your region's /
division's plan to change practice?
Based explicitly on what you had
learned in earlier professional
development experiences?
Designed to support Preschool
licensing and curriculum standards?
Designed to support KS-CBECE
assessments?
Please add any additional comments about how the CB Process training is congruent
with your own goals for professional development and/or KS-CBECE preschool
goals and standards.
________________________________________________________________________
3. I know how the research about challenging behaviors is connected to my
instructional practice.
• Yes
• No
193
5. Challenging Behavior Process
We are interested in finding out what type of activities and supports are available to
help teachers understand the Challenging Behavior Process. Now that you have
taken the Challenging Behavior Process training and have had time to implement
your new skills and knowledge, please answer the following questions to the best of
your ability.
1. Did you receive support, either from peers, administration, or the trainers to
help implement the skills and knowledge learned from the Challenging
Behavior Process training?
• Yes
• No
2. How did this professional development activity help you to implement the
Challenging Behavior Process? Check all that apply.
• Practiced under simulated conditions, with feedback
• Received coaching or mentoring in the classroom
• Met formally with other activity participants to discuss implementation
• My teaching was observed by my EC/AEC and feedback was provided
• My teaching was observed by other participants and feedback was
provided (e.g., discuss whether to write BIR)
• Communicated with my EC/AEC concerning classroom implementation
• Communicated with my Instructional Specialist concerning classroom
implementation
• Communicated with my Outreach Counselor concerning classroom
implementation
• Met informally with other participants to discuss classroom
implementation
• Other (please specify)
3. If you received support, how often was it provided?
• 1 time
• 2 times
• 3 times
• 4 times
• 5 times
• 6 or more times
4. Do you feel that you received enough support?
• Yes
• No
194
5. If you answered "No" to Question 4, did you need more or less support, and
why?
6. If you received support for this professional development activity, what was
the most helpful about the support?
7. If you received support for this professional development activity, what was
the least helpful about the support?
8. Now that you have had the opportunity to implement the new skills and
knowledge from the Challenging Behavior Process training, how important to
you are the following for implementation purposes?
Not at All
A Little
Important Important
Very
Important
Coaching
Feedback
Peer Mentoring
Administrative Support
Collegial Support
Follow-up training by
trainers
195
6. Challenging Behavior Process
1. Have any of the following issues arisen in your efforts to introduce changes
in your teaching based on your experience in the professional development
activity?
Issue
did not
arise
Issue arose
- Minor
problem
Issue arose -
Moderate
problem
Issue arose
- Major
problem
Insufficient planning time
Inadequate classroom
resources
Resistance from other teachers
Resistance from
administrators
Resistance from parents
Class size too large to
implement changes
Conflict between changes and
needs of my students
Conflict between changes and
KS-CBECE assessments
Conflict between changes and
KS-CBECE curriculum
frameworks/content standards
Conflict between changes and
other reform efforts
Insufficient opportunity to
practice new skills
Other (Please specify below)
196
7. Teacher Beliefs
This questionnaire is designed to help us gain a better understanding of the kinds of
things that create challenges for teachers. Your answers are confidential.
Directions: Please indicate your opinion about each of the questions below by
marking any one of the nine responses in the columns on the right side, ranging from
(1) “None at all” to (9) “A Great Deal” as each represents a degree on the continuum.
Please respond to each of the questions by considering the combination of your
current ability, resources, and opportunity to do each of the following in your present
position
1 - None at all
2
3 - Very Little
4
5 – Some Influence
6
7 – Quite a Bit
8
9 - A Great Deal
1
How much can you do to control disruptive
behavior in the classroom?
2
How much can you do to motivate students
who show low interest in school work?
3
How much can you do to calm a student
who is disruptive or noisy?
4
How much can you do to help your students
value learning?
5
To what extent can you craft good questions
for your students?
6
How much can you do to get children to
follow classroom rules?
7
How much can you do to get students to
believe they can do well in school work?
8
How well can you establish a classroom
management system with each group of
students?
9
To what extent can you use a variety of
assessment strategies?
10
To what extent can you provide an
alternative explanation or example when
students are confused?
11
How much can you assist families in
helping their children do well in school?
12
How well can you implement alternative
teaching strategies in your classroom?
197
7. Thank You
Thank you for taking the time to complete this survey. Your responses will help to
strengthen training and resources for teachers in the future.
ABSTRACT
Although professional development is an important means of improving both teachers’ skills and student outcomes, there is a dearth of high quality empirical research on the efficacy of such efforts. The efficacy of the Challenging Behavior Process was assessed using a mixed method approach which included the use of pre-, post-, and follow-up surveys. The participants were preschool teachers who worked for Kamehameha Schools on the island of Oahu. The relationship of the training to teacher efficacy, learning, and transfer was assessed. The analyses determined that the training was well-designed to meet the needs of the teaching staff, participants reacted positively to the training, and participation in the training was related to positive changes in knowledge about the CB process. However, participation in the training was also related to a statistically significant decrease in teacher efficacy for teaching strategies at the time of the follow-up and a number of knowledge gaps were uncovered. Finally, the study revealed a reporting rate for challenging behaviors that was much lower than expected. The overarching implication drawn from this study is that comprehensive evaluation of professional development in education is both necessary and valuable. It is not enough to collect data about how an initiative is working. It is important to take time to put the pieces of the professional development puzzle together to determine how and if professional development efforts create or support change in the classroom.