TRAINING STAFF TO IMPLEMENT THE INTERVIEW-INFORMED SYNTHESIZED
CONTINGENCY ANALYSIS (IISCA)
by
Vincent Campbell
___________________________________________________________
A Thesis Presented to the
Faculty of the USC Dornsife College of Letters, Arts and Sciences
University of Southern California
In Partial Fulfillment of the Requirements for the Degree
Master of Science
(APPLIED BEHAVIOR ANALYSIS)
December 2019
Table of Contents
Abstract
Introduction
Methods
Participants and Settings
Experimental Design
Measurement
Procedures
Social Validity
Results
Training
Trainee Implementation of IISCA with Children with Autism
Social Validity
Discussion
References
Figures
Figure 1: Task analysis data sheet
Figure 2: Task analysis data sheet
Figure 3: Open-ended functional assessment interview
Figure 4: Open-ended functional assessment interview
Figure 5: Open-ended functional assessment interview
Figure 6: IISCA checklist
Figure 7: Assessment Questionnaire
Figure 8: Assessment Questionnaire
Figure 9: Concurrent multiple-probe graph across participants
Figure 10: IISCA for King and James
Figure 11: Change in Finley and Baylor’s social validity scores
Abstract
Hanley et al. (2014) developed the interview-informed synthesized contingency analysis
(IISCA), which provided a rapid method for experimentally assessing challenging behavior
using synthesized conditions and precursor behaviors. No published research to date has
evaluated procedures for training practicing professionals to implement the IISCA. In this study, three
individuals volunteered for a training on IISCA methodology. After receiving training, all
participants demonstrated high procedural fidelity while conducting the IISCA. Additionally,
supplemental maintenance probes and generalization probes with actual clients demonstrated that
participants maintained high procedural fidelity. These results suggest that the IISCA is likely a
trainable assessment for practitioners and that training can be relatively rapid and therefore
efficient.
Training Staff to Implement the Interview-Informed Synthesized Contingency Analysis (IISCA)
Applied Behavior Analysis (ABA) is based upon determinism, the assumption
that all behavior is caused in an orderly fashion by the organism’s learning history and current environment.
Behavior analysts study how the environment affects behavior by measuring motivating
operations, antecedent stimuli, and maintaining consequences. Edward Carr's (1977) study was
among the first to address the hypothesized causes of self-injurious behavior (SIB) from a
functional analytic perspective. Carr’s early analysis suggested that SIB may be maintained by
the following three empirically supported causes: 1) SIB is a learned behavior, maintained by
positive reinforcement; 2) SIB is a learned behavior, maintained by negative reinforcement; and
3) SIB is a behavior that produces self-stimulation. Further, Carr’s analysis endorsed the use of
function-based treatment when addressing challenging behaviors and suggested the development
of a function-based assessment to help identify possible maintaining consequences.
Iwata et al. (1982/1994) developed the first formalized, experimental functional analysis.
During the analysis, researchers recruited nine individuals who engaged in some form of SIB.
Researchers repeatedly exposed the participants to a randomized series of four different analog
conditions. The first condition, often referred to as the escape from demands condition, consisted
of researchers delivering demands to the participants. The researchers ceased the delivery of
demands contingent on the participant engaging in SIB. The second condition, often referred to
as the access to attention condition, consisted of researchers removing attention from the
participant. Participants only received attention contingent on engaging in SIB. The third
condition, often referred to as the control condition, consisted of researchers providing
noncontingent access to praise and preferred play items. The final condition, often referred to as
the alone condition, consisted of the participant alone in an empty room. Throughout all
assessment conditions, researchers recorded the occurrence and nonoccurrence of SIB during
continuous 10 s intervals. The results of repeated exposures to each condition indicated
differentiated patterns of responding between participants and across each condition. Based on
these results, researchers suggested that the single condition experimental methodology is a
viable assessment for identifying the maintaining consequence of SIB.
Since the formalized introduction of the single condition functional analysis (Iwata et al.,
1982/1994), researchers have produced hundreds of successful replications of Iwata et al.
(1982/1994) methodology (Hanley, Iwata, & McCord, 2003; Beavers, Iwata, & Lerman, 2013).
Additionally, researchers have developed variations of the single condition functional analysis.
Northup et al. (1991) first attempted to shorten the single condition functional analysis by
developing the brief functional analysis. Using the brief functional analysis, researchers
demonstrated that limited exposures to single condition sessions produced differentiated
responding between conditions for a minority of participants (Derby et al., 1992; Wilder,
Masuda, O’Connor, & Baham, 2001; Wilder et al., 2006). Next, Sigafoos and Saggers (1995)
developed the trial-based functional analysis. The trial-based functional analysis allowed
researchers to embed functional analysis conditions within an individual’s natural environment
and demonstrate differentiated responding between single-condition trials (Bloom, Iwata, Fritz,
Roscoe, & Carreau, 2011; Rispoli, Ninci, Neely, & Zaini, 2014). Others have assessed the
maintaining consequence of behaviors using latency-based functional analyses (Thomason-Sassi,
Iwata, Neidert, & Roscoe, 2011). In a latency-based functional analysis, researchers measure
the latency from the onset of each single condition to the first occurrence of the target
behavior. Researchers then compare the latencies between conditions, and differentiation in
latencies across conditions can reveal the function of the behavior (Lambert et al., 2017).
Borrero and Borrero (2008) modified the single condition functional analysis by assessing
precursor behaviors within the standard single condition functional analysis. The results of
Borrero and Borrero’s (2008) research confirmed that measuring precursor behaviors was
effective in both identifying the maintaining function and minimizing challenging behavior.
While a considerable number of the functional analyses described above measure
challenging behaviors by isolating potential maintaining consequences (e.g., escape), Beavers et
al. (2013) acknowledged a potential limitation to these procedures. According to Beavers et al.
(2013), a growing number of single condition functional assessments identify undifferentiated or
multiply-controlled consequences for challenging behaviors. To capture the occurrence of
challenging behavior under multiply-controlled consequences, researchers have since adapted the
single condition functional analysis to include synthesized conditions (Hagopian, Bruzek,
Bowman, & Jennett, 2007). Hanley (2012) first proposed a formal framework for assessing
behaviors based on the results of an open-ended interview and supplemental testing of
synthesized consequences (e.g., escape a demand to receive access to a toy).
Hanley et al. (2014) first developed the interview-informed synthesized contingency
analysis (IISCA). Three families volunteered their children, each of whom engaged in some form
of challenging behavior, to participate in the IISCA. For all participants, the IISCA
consisted of an open-ended interview, direct observation, and experimental synthesized
contingency functional analysis. The open-ended interview lasted between 30 and 45 min and
contained questions that informed the researcher about the topography of both precursor
behaviors and challenging behavior, the context in which the challenging behavior is most likely
to be observed, previous interventions used to correct the challenging behavior, and the
participant’s general skills. The direct observation lasted between 15 and 30 min and allowed
researchers to note any antecedents or consequences that may influence the occurrence of the
challenging behavior. Finally, the functional analysis consisted of a randomized series of test and
control conditions. The interview and observation informed the arrangement of each condition.
During the control condition, the hypothesized reinforcers were available
noncontingently. During the test condition, the researcher withheld the hypothesized reinforcers
and delivered them for 30 seconds contingent on challenging behavior. The results of the IISCA
indicated differentiated responding between the test and control conditions for each participant.
Researchers then treated each topography of challenging behavior by developing individualized
treatment plans for each participant, based on the results of the IISCA. Treatment plans involved
the researchers teaching a simple functional communication response (FCR), a complex FCR,
and finally a denial- and delay-tolerance response. Simple FCR involved initially teaching a
simple response (e.g., “my way, please”) using behavioral skills training (BST). After each
participant independently emitted a simple response across two consecutive sessions, the
researchers increased the complexity of the response. During the complex response phase,
participants were first required to make a bid for the researcher’s attention (e.g., “Excuse me.”),
wait for the researcher to acknowledge the participant’s response, and provide the simple
response. Finally, during the denial- and delay-tolerance response training phase, participants
would perform the complex response, and researchers provided immediate access to the
identified consequence two out of five times. When the participant did not receive immediate
access to the identified consequence, the researcher would deny the participant’s request (e.g.,
“No.”), and the researchers used BST to teach the participant to perform the tolerance response
(e.g., taking a deep breath). Initially, if the participant performed the tolerance response, the
researcher provided access to the identified consequence. As the participant independently
performed the tolerance response, the researcher increased the duration to receive access to the
identified consequence. Overall, the results of Hanley et al. (2014) indicated that the IISCA is a
useful tool in identifying and treating challenging behaviors.
Since Hanley et al. (2014), the IISCA has received additional research support as a
viable assessment tool (Ghaemmaghami, Hanley, Jin, & Vanselow, 2016; Slaton, Hanley, &
Raftery, 2017). Jessel, Hanley, and Ghaemmaghami (2016) demonstrated the utility of the IISCA
in a replication across 30 individuals, between the ages of 1.8 years and 30 years. All participants
had developmental or behavioral disorders (e.g., ASD, ADHD, and GAD) and engaged in some
form of challenging behavior (e.g., aggression, SIB, loud vocalization). Throughout the study,
researchers measured the frequency of challenging behavior (e.g., aggression, SIB, loud
vocalization) and duration of reinforcement. During the functional assessment process,
researchers conducted the IISCA according to the Hanley et al. (2014) methodology. Open-
ended interviews lasted 30-90 min, and direct observations were 10-20 min in duration.
Researchers tested each client’s hypothesized maintaining consequence using a multielement
design. Results of each IISCA indicated convincing support for efficiently identifying the
maintaining consequence. An additional analysis of ten of the 30 participants indicated a
differentiation in responding as early as the first exposure to test conditions. Overall, these
results provide support for IISCA methodology and indicate that synthesized conditions are both
effective and efficient in identifying the maintaining functions for challenging behaviors.
Santiago, Hanley, Moore, and Jin (2016) provided further support for the IISCA in a replication
within school and home settings. A 14-year-old male and an 11-year-old female diagnosed with
autism spectrum disorder (ASD) participated in the Santiago et al. (2016) study. Both
participants engaged in several topographies of challenging behaviors (e.g., aggression, SIB,
property destruction). During the assessment, researchers measured the frequency of challenging
behaviors and reinforcement duration, and during supplemental treatment sessions, researchers
measured the FCR, tolerance responses, type of instructions delivered, and compliance to
instructions. Researchers conducted IISCA methodology similar to Hanley et al. (2014);
however, researchers only conducted an open-ended interview, and omitted direct observation,
before the functional analysis. Through the use of the IISCA, researchers hypothesized and
confirmed the first participant’s challenging behaviors to be maintained by escape to tangibles
and attention. Based on the results of the open-ended interview, researchers identified that access
to preferred tangible items; access to preferred conversation topics; and access to adult attention
maintained the second participant’s challenging behaviors. Researchers conducted an isolated,
access to attention, functional analysis. The isolated condition functional analysis failed to
produce conclusive results. Researchers then conducted two synthesized functional analyses,
testing access to a tangible with attention and escape to a preferred conversation topic. The
results of both synthesized functional analyses confirmed the hypothesized consequence of
challenging behaviors. Researchers, along with parents and teachers, implemented functional
communication training (FCT), similar to Hanley et al. (2014). FCT effectively decreased
the frequency of challenging behaviors for both participants and increased correct responding to
denial and delay to a preferred consequence. The results of Santiago et al. (2016) both
demonstrated the utility of the IISCA and encouraged the use of the IISCA for the greater
community of early intervention service providers.
Replications like Santiago et al. (2016) are essential to science (Tincani & Travers,
2019). However, very few studies outside of Hanley’s direct supervision have been completed on
the IISCA, although this is not unusual, given how new the IISCA research is. To date, only two
published studies outside of Hanley’s direction, of which we are aware, show support for the
IISCA (Strand & Eldevik, 2018; Herman, Healy, & Lydon, 2018). Strand and Eldevik (2018)
used the IISCA with one participant in an early behavior intervention program. The results of the
IISCA informed researchers of the maintaining consequences for the participant’s challenging
behavior and researchers successfully implemented a function-based treatment that effectively
decreased challenging behaviors and increased adaptive behaviors. Similarly, Herman, Healy, and
Lydon (2018) conducted the IISCA with an individual who engaged in challenging behaviors.
Researchers used the IISCA to identify the maintaining function of the challenging behaviors,
and additional behavior-based treatments reduced challenging behaviors and increased
compliance. These two studies provided further support for the practicality and generality of the
IISCA; however, additional research, outside of Hanley’s direct supervision, is still needed to
further establish its generality (Baer, Wolf, & Risley, 1968).
Despite the accumulating publications on the successful use of the IISCA, no published
research has identified whether staff can be trained to implement the IISCA with high fidelity. However,
multiple studies on training single condition functional analyses have been published. Iwata et al.
(2000), for example, demonstrated that individuals with little clinical experience acquired the skills
necessary for conducting a single condition functional analysis after receiving basic training.
Training procedures consisted of reading a published single condition functional analysis
article, watching a video recorded simulation of correct procedural implementation, passing a
written quiz, and receiving feedback on performance during sessions. Similarly, Kunnavatana,
Bloom, Samaha, and Dayton (2013) trained four teachers to conduct trial-based functional
analyses within a classroom. Training consisted of participants observing a video model that
depicted trial-based functional analysis methodology; recording observational data, based on the
video model; and observing how data helps determine the function of the behavior. Additionally,
researchers used role-play with immediate feedback to ensure that the participants implemented
trial-based functional analysis methodology with high fidelity. The results demonstrated that all
teachers’ fidelity improved after training and that teachers conducted the trial-based functional analysis
above baseline fidelity levels when they applied trial-based functional analysis methodology with
actual students. The research described above suggests that practitioners can be trained in
experimental single condition functional analysis methodology, but no prior research, of which we are aware, has
attempted to train practitioners to implement the IISCA.
The purpose of the current study was to evaluate a behavioral skills training procedure for
training professionals to implement the IISCA with challenging behaviors exhibited by
children with autism. This study (a) trained practitioners to implement the IISCA with high
fidelity, (b) evaluated the practitioners’ implementation of the IISCA with real clients in order to
identify synthesized contingencies for challenging behaviors, and (c) evaluated the extent to
which practitioners view the IISCA as a socially acceptable assessment.
Methods
Participants and Setting
Three individuals volunteered to participate in this study. All three participants were
employed by an agency that provides behavioral early intervention services for children with
ASD. All participants were professionals who supervise the implementation of behavior analytic
interventions for clients that displayed varying degrees of developmental delays and challenging
behaviors. Throughout this study, researchers conducted baseline, training, and post-training
(i.e., rehearsal) sessions at each participant’s clinical site. During each one-on-one session, the
primary researcher and the participant met in an office within the clinic, to minimize distractions.
Office rooms were equipped with a computer, camcorder, study related materials (e.g., data
sheet, preferred toys), and at least two chairs. Sessions occurred between once every other week
and one to two times per week. Additionally, session duration ranged from 2 min to 45 min
across all conditions.
Before baseline, all three participants expressed knowledge of functional assessments but
had not received any direct training on IISCA methodology before participation in this study.
The first participant, Finley, held a master’s degree and had worked as a Board Certified Behavior Analyst (BCBA) for two and a half
years. Finley had supervised and managed the implementation of behavioral interventions for
LeBron for over six months. LeBron was a 4-year-old boy diagnosed with ASD. LeBron
received support from a registered behavior technician (RBT) at preschool, home, and at the
agency’s center. According to Finley’s report, LeBron displayed varying degrees of lying
behavior at home and during clinical sessions. Finley defined LeBron’s lying as inaccurately
responding to a discriminative stimulus and maintaining the incorrect response after receiving
corrective feedback, across settings. For example, when a person approached LeBron, LeBron
pointed to the person and called the person a name (e.g., “Bee Bee” or “Amanda”). Despite
LeBron receiving feedback for the person’s name, LeBron’s behavior persisted.
Jordyn held a master’s degree and had worked as a BCBA for five years. Throughout the
study, Jordyn supervised and managed the implementation of behavioral interventions for King
for over ten months. King was a 7-year-old boy diagnosed with ASD. King received support
from an RBT both at his home and at the agency’s center. King’s mother and clinical staff
reported that King displayed varying degrees of tantrum behaviors (e.g., self-injurious behavior,
noncompliance, and crying). According to King’s mother and clinical staff, King’s tantrum
behaviors often occurred when King’s mother left the clinical site or home without King.
Additionally, Jordyn identified that tantrum behaviors were preceded by King following King’s
mother, King yelling, “No,” and King grabbing and pulling on King’s mother’s clothes and hand.
At the time of the study, Baylor was enrolled in a master’s program with a BCBA-
approved course sequence and had been employed by the early intervention agency for eleven
years. Baylor had assisted in the supervision of James for over three months. James was a 3-year-
old boy diagnosed with ASD. James received intervention sessions at the agency’s center.
According to the clinical staff’s observations, James displayed varying degrees of tantrum
behaviors. Tantrum behaviors included self-injurious behavior (i.e., falling to the floor),
screaming, and crying. The staff reported that James’s behavior often occurred when male
employees entered the room. James’s tantrum behavior was preceded by James saying
“Goodbye” to the male therapist or James manding to walk to a room where a female staff
member was located.
Experimental Design
A concurrent multiple-probe design was used to evaluate the effects of the training
across the three participants. After training was complete, researchers conducted a maintenance
probe and generalization probe for all participants.
Measurement
The primary researcher collected all dependent measures through direct observations
across all sessions. However, during the generalization probe of the IISCA, participants acted as the
primary data collectors, while the researchers observed the recorded sessions.
Implementation of the IISCA. Throughout the study, researchers measured the
percentage of IISCA procedures correctly implemented, as specified in the task analysis data
sheet (Figure 1 & Figure 2). By using the task analysis data sheet, researchers recorded each
participant’s performance across the following six domains: (1) conducting the open-ended
functional assessment interview (Figure 3, Figure 4, & Figure 5; Hanley 2012), (2) evaluating
the target behavior through a direct observation, (3) designing assessment conditions (e.g., define
the target behaviors and hypothesized putative reinforcer), (4) implementing the control
condition, (5) implementing the test condition, and (6) analyzing the results of the assessment.
Researchers scored responses as either correct or incorrect. Researchers then calculated the
percentage of correct responses by dividing the sum of correct responses by the total number of
responses and multiplying the resulting quotient by 100%.
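As a purely illustrative worked example (the step counts below are hypothetical, not data from this study), a participant who correctly completed 17 of the 20 steps scored for a given session would earn

\[ \frac{17}{20} \times 100\% = 85\%\ \text{correct.} \]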
Frequency of challenging behavior during the experimental functional analysis.
Participants collected data using pen and paper. Similar to Santiago et al. (2016), participants
scored each instance of challenging behavior or any identified precursor behaviors. Upon the
completion of each condition, participants calculated the rate of challenging behavior by dividing
the total frequency of challenging behavior recorded by the duration of the condition.
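For instance, under the hypothetical assumption of a 5-min condition in which 10 instances of challenging or precursor behavior were recorded, the resulting rate would be

\[ \frac{10\ \text{responses}}{5\ \text{min}} = 2\ \text{responses per minute.} \]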
Interobserver agreement. During the training for the IISCA, researchers assessed
interobserver agreement (IOA) by having a second observer collect data on the participant’s
target behaviors. The second observer reviewed recordings of at least 33% of all conditions for
each participant. Researchers scored the point-by-point adherence to the task analysis data sheet
for each session. Observers calculated IOA by dividing the sum of agreements by the total
number of agreements and disagreements and multiplying the resulting quotient by 100%. The
mean IOA for the implementation IISCA procedures was 98% (range, 95%-100%) for all
participants, 95% (range, 88% to 100%) for Finley, 100% for Jordyn, and 100% for Baylor.
During the application of the functional analysis, a second observer observed every condition for
each client. Observers calculated IOA of each recorded response by dividing the sum of the
agreements by the total number of agreed and disagreed recorded responses and multiplying the
resulting quotient by 100%. The mean IOA of challenging behavior was 100% agreement across
all participants’ implementation of the IISCA with clients.
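As a hypothetical illustration (the agreement counts below are not the study’s data), if two observers scored 24 task analysis steps for a session and agreed on 22 of them, point-by-point IOA would be

\[ \text{IOA} = \frac{22}{22 + 2} \times 100\% \approx 92\%. \]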
Procedures
Training.
Before baseline. Researchers created ten fictitious client profiles. Each profile included a
client’s demographic information, scripted answers to an open-ended interview, and a short 45 s
to 75 s taped observation of the client’s identified challenging behavior. Researchers used
scripted open-ended interview answers during interviews that the participants conducted. In the
cases in which participants conducted an experimental analysis, the researchers portrayed the
hypothetical client and enacted the target behavior in accordance with the pre-established
hypothetical function.
Baseline. A week before each participant’s first baseline session, each participant
received a hard copy of the Hanley et al. (2014) article. Baseline sessions began with the
participant selecting a card that corresponded to a fictitious client’s profile. Once a profile was
selected, the researchers removed the card from the deck, so that the participant would receive a
different profile each session. Researchers then provided the participant with a description of the
fictitious client’s demographic information (e.g., age, preferred tangibles, and verbal abilities).
Additionally, researchers informed the participant that the participant was consulted to conduct
an IISCA for the client and requested how the participant would like to proceed. In the case that
the participant requested an interview, the researchers acted as the interviewee (e.g., the client’s
parent or teacher) and provided the scripted answers, outlined for the client. If the participant
requested an observation of the client, the researchers also provided the taped observation. In the
case that the participant requested to conduct a functional analysis, the researcher acted as the
fictitious client and only displayed the target behavior if the putative reinforcer was removed.
Researchers did not provide any programmed consequences contingent on the participant’s
adherence to the task analysis data sheet, and if the participant asked for information not
included in the client profile, the researcher would tell the participant that the information was
unavailable.
Behavioral skills training. Researchers employed BST (Krumhus & Malott, 1980) to
train participants on the implementation of the IISCA. During the training, researchers provided
participants with instructions on IISCA methodology, a model of procedures, and rehearsals with
feedback on all components of the IISCA. Each participant remained in the rehearsal with
feedback phase of BST until the participant correctly completed all components of the IISCA
with over 85% accuracy, across two consecutive sessions.
Instruction. During the first component of the training, researchers provided participants
with instructions on how to complete each component of the IISCA. During this phase,
researchers disseminated an IISCA checklist (Figure 6) to each participant. Researchers modified
the checklist used by Alnemary, Wallace, Alnemary, Gharapetian, and Yassine (2017) to outline
the steps of an IISCA. After participants reviewed the checklist, each participant watched a
PowerPoint presentation that both described the purpose of the IISCA and outlined all the
components of the IISCA, according to the checklist. During the description of the procedures,
researchers provided a tutorial on the open-ended functional assessment interview that reviewed
questions to identify a client’s current skills level, challenging behavior, contexts in which the
challenging behavior typically occurs, and typical responses that occur before and after
challenging behaviors. The PowerPoint also explained the purpose of direct observations. The
PowerPoint then reviewed the purpose of operationally defining a target behavior to include
precursor behavior, identifying putative reinforcers, and describing the control and test
conditions. Next, the PowerPoint provided an automated example of the control and test
conditions. Finally, researchers reviewed potential scenarios for analyzing the results of the
functional analysis. During the analysis portion, the participant observed one example of clear
differentiation between test and control conditions and one example of a scenario that required
additional exposure to test and control conditions before the participant observed a clear
differentiation between the test and control conditions. The entire instruction phase lasted less
than 30 min in duration.
Model. Participants then observed a model of the correct implementation of IISCA
procedures. The primary researcher demonstrated each component of IISCA methodology,
according to Hanley et al., (2014). The demonstration included a synthesized training example
which contained a simulated interview, an observation of the challenging behavior, planning the
control and test conditions, the functional analysis, and an evaluation of the functional analysis’s
results. During the model, the primary researcher played the role of the assessor, and the
participant played the role of the target client. The entire modeling phase lasted 15 min in
duration.
Rehearsal and Feedback. Researchers conducted rehearsal sessions similar to baseline.
During rehearsal sessions, participants selected a card that corresponded to a novel fictitious
client’s profile. Researchers provided the participant with the fictitious client’s demographic
information, informed the participant that the participant was consulted to conduct an IISCA for
the client, and requested how the participant would like to proceed. If the participant did not
respond within 10 s of the antecedent verbal stimulus, researchers delivered a verbal prompt for
the participant to consult the IISCA checklist. Throughout the rehearsal with feedback phase,
participants received positive praise for correct responses (i.e., completing the corresponding
step within the task analysis data sheet). Incorrect responses received an immediate verbal
prompt and a re-presentation of the corresponding step.
Maintenance. Participants received maintenance probes two to four weeks after each
participant met the mastery criteria. Researchers conducted maintenance probes identical to
baseline, using fictitious client profiles that had not been presented during training.
Application of IISCA. After completing the rehearsal phase, each participant
implemented the IISCA with their corresponding client. However, given each participant’s
familiarity with their respective clients, the researchers allowed for minor modifications to
trained IISCA procedures. For example, during the interview portion, Jordyn and Baylor
completed the open-ended functional assessment interviews based on their respective client’s
clinical staff’s previous reports. Following the modified interview, researchers did not require
each participant to observe the target behavior, as each participant indicated that they had
previously observed the target behavior. Participants then planned a functional analysis, by
operationally defining the target behavior, precursor behaviors, and hypothesized test and control
conditions. During the functional analysis, the participant tested the hypothesized putative
reinforcers by comparing each client’s rate of challenging behavior in alternating control and test
conditions (e.g., control, test, control, test, test). During the control conditions, participants made
the putative reinforcers continuously available throughout the session. During the test condition,
participants removed all putative reinforcers at the start of the session and returned putative
reinforcers for 30 s, contingent on the occurrence of either challenging behavior or precursor
behavior. Throughout the functional analysis, participants only controlled for the putative
reinforcers, and all other materials remained noncontingently available in both the control and
test conditions. Thus, the only differences between the control condition and the test condition
were the presence of the hypothesized establishing operations reported to evoke challenging behavior
and the contingency whereby challenging behavior or precursor behavior produced 30 s of access to the putative
reinforcer. Contingencies in the functional analysis were unique to each participant and included
escape from receiving additional corrective feedback to a previously mastered target and access
to attention (LeBron); access to adult attention (King); and escape from male clinical staff and
access to female attention (James).
Social Validity
Researchers collected measures of social validity for each participant. Before baseline
and after the maintenance session, researchers asked each participant to complete the social
validity measure on functional assessments (Figure 7 & Figure 8). The social validity measure requested
that each participant compare a combination of indirect and descriptive functional assessment
procedures commonly completed in early intensive behavioral intervention (EIBI) programs. Participants rated each assessment
according to the following domains: (1) the perceived usefulness of an assessment within your
practice (0: unsure; 1: not useful; 5: very useful), (2) how often the assessment is used within
your practice (0: unsure; 1: not often; 5: very often), (3) the practicality of an assessment within
your practice (0: unsure; 1: not practical; 5: very practical), (4) the perceived accuracy of an
assessment within your practice (0: unsure; 1: not accurate; 5: very accurate), and (5) the perceived
familiarity of an assessment’s methodology (0: unsure; 1: not familiar; 5: very familiar).
Results
Training
Figure 9 depicts the percentage of correct implementation across all components of the
IISCA. During baseline, Finley requested an observation of the challenging behavior in 100% of
sessions, correctly planned the functional analysis in 67% of sessions, and correctly analyzed the
function of the challenging behavior in 33% of sessions. However, Finley did not correctly
conduct the open-ended functional assessment interview nor implement the control and test
conditions of the experimental functional analysis. Both Jordyn and Baylor demonstrated no
correct responses across any components of the IISCA during baseline.
During the rehearsal phase, each participant’s percentage of correct implementation
increased substantially. Finley received feedback on the implementation for the test condition of
the functional analysis and the analysis of the functional analysis during the first rehearsal
session. Thereafter, Finley correctly responded with 100% accuracy across all components of the
IISCA. Jordyn initially implemented the test condition of the functional analysis with 92%
accuracy and all remaining components of the IISCA were implemented with 100% accuracy.
During the second rehearsal session, Jordyn correctly implemented the interview (100% correct),
observation (100% correct), and planning (100% correct). Jordyn received feedback on the
implementation for the control condition (75% correct), test condition (63% correct), and
analysis of the IISCA (50% correct). During the third and fourth rehearsal, Jordyn correctly
responded with 100% accuracy across all components of the IISCA. Baylor initially
implemented all components of the IISCA with 100% accuracy. During Baylor’s second
rehearsal, Baylor implemented the control condition of the functional analysis with 88%
accuracy and all remaining components of the IISCA were implemented with 100% accuracy.
The maintenance probes demonstrated a continued high percentage of correct responding
(100% correct) across all components of the IISCA. Finley received a maintenance probe four
weeks after meeting the mastery criteria of the rehearsal phase, Jordyn received a maintenance
probe three weeks after meeting the mastery criteria of the rehearsal phase, and Baylor received a
maintenance probe two weeks after meeting the mastery criteria of the rehearsal phase.
The mean performance on the application of the IISCA was 100% correct responding
across all participants.
Trainee Implementation of IISCA with Children with Autism
Figure 10 depicts the results of each client’s IISCA. Based on Jordyn’s modified
interview with King’s mother, Jordyn hypothesized that King’s challenging behavior was
maintained by access to King’s mother’s attention (Figure 10, top). When Jordyn coached King’s
mother to provide attention, King’s challenging behavior was at zero occurrences per minute.
During the test condition, King’s challenging behavior remained at zero occurrences per minute.
Finally, Baylor hypothesized that James’s challenging behavior was maintained by escape from
male clinical staff and access to female attention (Figure 10, bottom). When Baylor provided
James with access to female attention, James’s challenging behavior was at zero occurrences per
minute. When the female clinical staff left the room and a male staff entered the room, James’s
mean level of observed challenging behavior was 2.5 occurrences per minute.
Social Validity
Figure 11 depicts the change in Finley’s and Baylor’s ratings of indirect and descriptive
functional assessment procedures. Before training, Finley rated direct observations (e.g., ABC
data) the most useful assessment (5: very useful), the most frequently used assessment (5: very
often), and the most familiar assessment (5: very familiar). Finley rated the trial-based functional
analysis, Functional Analysis Screening Tool (FAST), direct observations, and scatter plots as
being the most practical assessments (4: mostly practical). Finley rated the analog functional
analysis as the most accurate assessment (5: most accurate). Finley rated the IISCA according to
the following domains: perceived usefulness (0: unsure), frequency of use (1: never used),
practicality of use (0: unsure), perceived accuracy (0: unsure), and perceived familiarity (1:
unfamiliar). After completing the maintenance probe, Finley’s ratings of the IISCA increased the
most across all domains: perceived usefulness (increase of 3), frequency of use (increase of 1),
practicality of use (increase of 2), perceived accuracy (increase of 3), and perceived familiarity
(increase of 4).
Before training, Jordyn rated direct observations the most useful assessment (5: very
useful), the most frequently used assessment (5: very often), and the most accurate assessment
(5: very accurate). Jordyn rated the FAST and direct observations as being the most practical
assessments (5: very practical) and most familiar assessments (5: very familiar). Jordyn rated the
IISCA according to the following domains: perceived usefulness (0: unsure), frequency of use (1:
never used), practicality of use (0: unsure), perceived accuracy (0: unsure), and perceived
familiarity (1: unfamiliar).
Before training, Baylor rated the trial-based functional analysis (TBFA), direct observations, and analog functional
analysis as the most useful assessment (5: very useful). Direct observations were rated as the
most frequently used assessment (5: very often). Direct observations, scatter plots, and
Functional Assessment Interview Form (FAIF) were rated as the most practical assessments (5:
very practical). The TBFA, direct observations, analog functional analysis, and IISCA were rated
as the most accurate functional assessments (4: mostly accurate). The FAST and direct
observations were rated as the most familiar assessments (5: very familiar). Baylor rated the
IISCA according to the following domains: perceived usefulness (4: mostly useful), frequency of
use (1: never used), practicality of use (4: mostly practical), perceived accuracy (4: mostly
accurate), and perceived familiarity (1: unfamiliar). After completing the maintenance probe,
Baylor’s ratings of the IISCA either increased or remained the same across all domains:
perceived usefulness (no increase), frequency of use (no increase), practicality of use (increase of
1), perceived accuracy (no increase), and perceived familiarity (increase of 4).
Discussion
The results of this research indicated that practitioners effectively learned to implement
IISCA methodology with high fidelity. In the current study, the fidelity of the IISCA increased
after participants received instructions and a model of IISCA procedures. Moreover, each
practitioner demonstrated mastery of the IISCA within four or fewer rehearsal-with-feedback
sessions and maintained high fidelity during the maintenance probe. These results suggest that
the IISCA is likely a trainable assessment for practitioners and that training can be relatively
rapid and therefore efficient.
In addition to demonstrating that the trainees could implement functional analysis procedures
with accuracy during training, which is where most previous research concludes, two trainees
had clients who were in need of IISCA assessments and therefore provided opportunities for the
trainees to implement the IISCA in real life. Both trainees implemented the IISCA with 100%
procedural integrity. This demonstration offers evidence of practitioners’ capability to apply the
IISCA with high fidelity for clients with various topographies of challenging behaviors after
receiving a relatively brief training. Evaluating whether trainees can actually use functional
analysis methodology after training is an important next step that is often lacking in previous
research on functional analysis training. Ultimately, training is of little value if it does not result
in trainees actually being able to implement their newly learned skills with the clients they work
with on a daily basis.
Although the trainees implemented the IISCA with 100% procedural integrity with their
clients, the results of the IISCA differed for the two clients. For James, the experimental
functional analysis identified that challenging behavior occurred at clearly differentiated rates
between control and test conditions, thereby indicating the potential maintaining function of the
challenging behavior. King’s challenging behavior did not occur during his IISCA, thus
preventing the IISCA from yielding results. After the IISCA was complete, the lack of
challenging behavior was discussed with his mother, who disclosed that she had implemented an
intervention five days before the functional analysis. King’s mother reported that King’s
challenging behavior had decreased after the intervention was implemented. Therefore, King’s
lack of challenging behavior during his IISCA could have been the result of the effective
behavioral intervention and a change in King’s motivating operations. Of course, this possibility
must be interpreted with caution. It is also possible that his IISCA was simply not effective for
him, despite the fact that it was implemented with high fidelity. This could be interpreted as a
limitation of the current study; however, demonstrating that the IISCA is effective was not the
purpose of the study. The purpose of the study was to evaluate a procedure for training staff on
how to implement the IISCA with high fidelity, accepting the possibility that the IISCA, like all
other procedures, is not guaranteed to be effective with 100% of clients. Overall, the results of
each client’s functional analysis provide further support for the IISCA as being an effective
assessment for identifying the functions of challenging behavior displayed by children with autism.
The results of the social validity assessment also support the possibility that trainees will
actually be able to implement the IISCA post-training. The social validity measures in the
current study identified that after the training on the IISCA, practitioners’ ratings of the IISCA
either increased or remained high across all measures, while a majority of other assessments’
ratings either decreased or remained the same. Further, these results indicated that each
practitioner’s preference to employ the IISCA within their practice potentially changed as a
result of an increase in their familiarity with IISCA procedures. Of course, it is important to note that
social validity measures consist of self-report, which is inherently biased, so the verbal reports of
staff should always be interpreted with caution. Nevertheless, overall, the results of the study
suggest that the IISCA is readily trainable and that it is acceptable to staff who receive training in
it.
While the results of this research identify the IISCA as a practical assessment for BCBAs
working with children with autism, this study is not without limitations. First, researchers did not
collect procedural integrity data on the trainer’s implementation of study procedures. Despite this
limitation, participants received no additional prompts during the implementation of the IISCA,
which indicated that the BST protocol was effective. Still, future research should consider
including procedural integrity data for the experimenters’ behavior. Second, we did not
experimentally evaluate treatments for the clients’ challenging behavior following the IISCA
assessments that were conducted by the trainees. Ultimately, analysis of a function-based
treatment based on the results of the IISCA is necessary to ensure the functional assessment’s
validity. However, a growing body of previously published research has demonstrated that
treatments based on the IISCA are effective (Ghaemmaghami, Hanley, Jin, & Vanselow, 2016;
Jessel, Hanley, & Ghaemmaghami, 2016; Slaton, Hanley, & Raftery, 2017). Nevertheless, future
research on training staff to implement the IISCA might consider including treatment evaluations
as well. Finally, participants’ ratings of social validity were potentially influenced by the
identified purpose of the study, as is the case with most measures of social validity.
The present study investigated the practicality of training staff to implement the IISCA
with children diagnosed with autism. Despite the growing support of the IISCA, the field of
ABA should replicate research on the effect of the IISCA. Future replications of the training of
IISCA should address the limitations of this study and extend the results to other populations.
For example, future research could evaluate training staff to implement the IISCA with
populations outside of autism, including children with conduct disorders or emotional
disturbance. In addition, given the small number of clinicians in the world who are currently
knowledgeable about the IISCA, future research should evaluate methods for scaling training up, for
example, by conducting training via telemedicine. Additionally, future research is needed to
identify if practitioners prefer specific functional assessments, including the IISCA. It is still
relatively uncommon for practitioners to implement experimental functional analyses, and it is
possible that the IISCA may be more highly preferred and more likely to be implemented by
practitioners working in real-world settings.
References
Alnemary, F., Wallace, M., Alnemary, F., Gharapetian, L., & Yassine, J. (2017). Application of
a pyramidal training model on the implementation of trial-based functional analysis: A
partial replication. Behavior Analysis in Practice, 10(3), 301-306.
Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied behavior
analysis. Journal of Applied Behavior Analysis, 1(1), 91-97.
Beavers, G. A., Iwata, B. A., & Lerman, D. C. (2013). Thirty years of research on the functional
analysis of problem behavior. Journal of Applied Behavior Analysis, 46(1), 1-21.
Bloom, S. E., Iwata, B. A., Fritz, J. N., Roscoe, E. M., & Carreau, A. B. (2011). Classroom
application of a trial-based functional analysis. Journal of Applied Behavior
Analysis, 44(1), 19-31.
Borrero, C. S., & Borrero, J. C. (2008). Descriptive and experimental analyses of potential
precursors to problem behavior. Journal of Applied Behavior Analysis, 41(1), 83-96.
Carr, E. G. (1977). The motivation of self-injurious behavior: A review of some
hypotheses. Psychological Bulletin, 84(4), 800.
Derby, K. M., Wacker, D. P., Sasso, G., Steege, M., Northup, J., Cigrand, K., & Asmus, J.
(1992). Brief functional assessment techniques to evaluate aberrant behavior in an
outpatient setting: A summary of 79 cases. Journal of Applied Behavior Analysis, 25(3),
713-721.
Ghaemmaghami, M., Hanley, G. P., Jin, S. C., & Vanselow, N. R. (2016). Affirming control by
multiple reinforcers via progressive treatment analysis. Behavioral Interventions, 31(1),
70-86.
Hagopian, L. P., Bruzek, J. L., Bowman, L. G., & Jennett, H. K. (2007). Assessment and
treatment of problem behavior occasioned by interruption of free-operant
behavior. Journal of Applied Behavior Analysis, 40(1), 89-103.
Hanley, G. P. (2012). Functional assessment of problem behavior: Dispelling myths, overcoming
implementation obstacles, and developing new lore. Behavior Analysis in Practice, 5(1),
54-72.
Hanley, G. P., Iwata, B. A., & McCord, B. E. (2003). Functional analysis of problem behavior: A
review. Journal of Applied Behavior Analysis, 36(2), 147-185.
Hanley, G. P., Jin, C. S., Vanselow, N. R., & Hanratty, L. A. (2014). Producing meaningful
improvements in problem behavior of children with autism via synthesized analyses and
treatments. Journal of Applied Behavior Analysis, 47(1), 16-36.
Herman, C., Healy, O., & Lydon, S. (2018). An interview-informed synthesized contingency
analysis to inform the treatment of challenging behavior in a young child with
autism. Developmental Neurorehabilitation, 21(3), 202-207.
Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., & Richman, G. S. (1994). Toward a functional
analysis of self-injury. Journal of Applied Behavior Analysis, 27, 197-209. (Reprinted
from Analysis and Intervention in Developmental Disabilities, 2, 3-20, 1982).
Iwata, B. A., Wallace, M. D., Kahng, S., Lindberg, J. S., Roscoe, E. M., Conners, J., ... &
Worsdell, A. S. (2000). Skill acquisition in the implementation of functional analysis
methodology. Journal of Applied Behavior Analysis, 33(2), 181-194.
Jessel, J., Hanley, G. P., & Ghaemmaghami, M. (2016). Interview-informed synthesized
contingency analyses: Thirty replications and reanalysis. Journal of Applied Behavior
Analysis, 49(3), 576-595.
Krumhus, K. M., & Malott, R. W. (1980). The effects of modeling and immediate and delayed
feedback in staff training. Journal of Organizational Behavior Management, 2(4), 279-
293.
Kunnavatana, S. S., Bloom, S. E., Samaha, A. L., & Dayton, E. (2013). Training teachers to
conduct trial-based functional analyses. Behavior Modification, 37(6), 707-722.
Lambert, J. M., Staubitz, J. E., Roane, J. T., Houchins-Juárez, N. J., Juárez, A. P., Sanders, K. B.,
& Warren, Z. E. (2017). Outcome summaries of latency-based functional analyses
conducted in hospital inpatient units. Journal of Applied Behavior Analysis, 50(3), 487-
494.
Northup, J., Wacker, D., Sasso, G., Steege, M., Cigrand, K., Cook, J., & DeRaad, A. (1991). A
brief functional analysis of aggressive and alternative behavior in an outclinic
setting. Journal of Applied Behavior Analysis, 24(3), 509-522.
Rispoli, M., Ninci, J., Neely, L., & Zaini, S. (2014). A systematic review of trial-based
functional analysis of challenging behavior. Journal of Developmental and Physical
Disabilities, 26(3), 271-283.
Santiago, J. L., Hanley, G. P., Moore, K., & Jin, C. S. (2016). The generality of interview-
informed functional analyses: Systematic replications in school and home. Journal of
Autism and Developmental Disorders, 46(3), 797-811.
Sigafoos, J., & Saggers, E. (1995). A discrete-trial approach to the functional analysis of
aggressive behaviour in two boys with autism. Australia and New Zealand Journal of
Developmental Disabilities, 20(4), 287-297.
Slaton, J. D., Hanley, G. P., & Raftery, K. J. (2017). Interview-informed functional analyses: A
comparison of synthesized and isolated components. Journal of Applied Behavior
Analysis, 50(2), 252-277.
Strand, R. C., & Eldevik, S. (2018). Improvements in problem behavior in a child with autism
spectrum diagnosis through synthesized analysis and treatment: A replication in an EIBI
home program. Behavioral Interventions, 33(1), 102-111.
Thomason-Sassi, J. L., Iwata, B. A., Neidert, P. L., & Roscoe, E. M. (2011). Response latency as
an index of response strength during functional analyses of problem behavior. Journal of
Applied Behavior Analysis, 44(1), 51-67.
Tincani, M., & Travers, J. (2019). Replication research, publication bias, and applied behavior
analysis. Perspectives on Behavior Science, 42(1), 59-75.
Wilder, D. A., Chen, L., Atwell, J., Pritchard, J., & Weinstein, P. (2006). Brief functional
analysis and treatment of tantrums associated with transitions in preschool
children. Journal of Applied Behavior Analysis, 39(1), 103-107.
Wilder, D. A., Masuda, A., O'Connor, C., & Baham, M. (2001). Brief functional analysis and
treatment of bizarre vocalizations in an adult with schizophrenia. Journal of Applied
Behavior Analysis, 34(1), 65-68.
Figure 1: Task analysis data sheet. This figure illustrates the first page of the data sheet used for
all participants to track fidelity when implementing the IISCA.
(Data sheet fields shown in the figure: session, participant, client, and date; scored steps and duration for the interview, observation, planning, and control-condition domains, each summarized as a percentage correct.)
Figure 2: Task analysis data sheet. This figure illustrates the second page of the data sheet used
for all participants to track fidelity when implementing the IISCA.
Session: ____ Participant: ____ Client: ____ Date: ____
CONDUCT TEST CONDITION (Percentage Correct; Duration)
Provide access to the putative reinforcer(s)
Start timer
Stay within 5' of the client
Remove the putative reinforcer(s)
Provide the putative reinforcer(s) contingent on the challenging behavior
Remove the putative reinforcer(s) after a 30 s reinforcement interval
Record Behavior
Graph data
ANALYSIS (Percentage Correct)
Ends IISCA after condition differentiation is observed
Identifies the function
IISCA DATA (TEST CONDITION)
Notes:
Figure 3: Open-ended functional assessment interview. This figure illustrates the first page of the
open-ended functional assessment interview.
Open-Ended Functional Assessment Interview
Developed by Gregory P. Hanley, Ph.D., BCBA-D (Developed August, 2002; Revised: August, 2009)
Date of Interview: Child/Client: Interviewer:
Respondent: Respondent’s relation to child/client:
RELEVANT BACKGROUND INFORMATION
1. His/her date of birth: Age: yrs mo Check one: Male Female
2. Describe his/her language abilities:
3. Describe his/her play skills and preferred toys or leisure activities:
4. What else does he/she prefer?
QUESTIONS TO INFORM THE DESIGN OF A FUNCTIONAL ANALYSIS
To develop objective definitions of observable problem behaviors:
5. What are the problem behaviors? What do they look like?
To determine which problem behavior(s) will be targeted in the functional analysis:
6. What is the single-most concerning problem behavior?
7. What are the top 3 most concerning problem behaviors? Are there other behaviors of concern?
Figure 4: Open-ended functional assessment interview. This figure illustrates the second page of
the open-ended functional assessment interview.
To determine the precautions required when conducting the functional analysis:
8. Describe the range of intensities of the problem behaviors and the extent to which he/she or others may be
hurt or injured from the problem behavior.
To assist in identifying precursors to dangerous problem behaviors that may be targeted in the functional analysis
instead of more dangerous problem behaviors:
9. Do the different types of problem behavior tend to occur in bursts or clusters and/or does any type of
problem behavior typically precede another type of problem behavior (e.g., yells preceding hits)?
To determine the antecedent conditions that may be incorporated into the functional analysis test conditions:
10. Under what conditions or situations are the problem behaviors most likely to occur?
11. Do the problem behaviors reliably occur during any particular activities?
12. What seems to trigger the problem behavior?
13. Does problem behavior occur when you break routines or interrupt activities? If so, describe.
Figure 5: Open-ended functional assessment interview. This figure illustrates the third page of
the open-ended functional assessment interview.
14. Does the problem behavior occur when it appears that he/she won’t get his/her way? If so, describe the
things that the child often attempts to control.
To determine the test condition(s) that should be conducted and the specific type(s) of consequences that may be
incorporated into the test condition(s):
15. How do you and others react or respond to the problem behavior?
16. What do you and others do to calm him/her down once he/she engaged in the problem behavior?
17. What do you and others do to distract him/her from engaging in the problem behavior?
In addition to the above information, to assist in developing a hunch as to why problem behavior is occurring and to
assist in determining the test condition(s) to be conducted:
18. What do you think he/she is trying to communicate with his/her problem behavior, if anything?
19. Do you think this problem behavior is a form of self stimulation? If so, what gives you that impression?
20. Why do you think he/she is engaging in the problem behavior?
Figure 6: IISCA checklist. This figure illustrates the checklist provided to each participant.
IISCA CHECKLIST
INTERVIEW YES NO N/A
Initiate the Open-Ended Functional Assessment Interview
Complete the Open-Ended Functional Assessment Interview
OBSERVATION YES NO N/A
Request a direct observation
Observe the client under conditions described in the Open-Ended FA Interview
PLANNING IISCA YES NO N/A
Develop a definition of the behaviors (i.e., precursor and challenging behavior)
Identify the putative reinforcers
Describe the hypothesized conditions in which the challenging behavior occurs
CONTROL CONDITION YES NO N/A
Provide access to the putative reinforcer(s)
Start timer
Stay within 5’ of the client
Record instances of challenging behavior
Provide no consequence for instances of challenging behavior
Graph data
TEST CONDITION YES NO N/A
Provide access to the putative reinforcer(s)
Start timer
Stay within 5’ of the client
Remove the putative reinforcer(s)
Provide the putative reinforcer(s) contingent on the challenging behavior
Remove the putative reinforcer(s) after a 30 s reinforcement interval
Record instances of challenging behavior
Graph data
ANALYSIS YES NO N/A
End IISCA after condition differentiation is observed
Identify the function
CLIENT: ____________________
DATE: ______________________
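The data sheets in Figures 1 and 2 summarize each phase of the checklist above as a percentage of correctly implemented steps. As a rough illustration only (the thesis does not publish its scoring procedure), the sketch below computes that percentage from yes/no/N/A marks, assuming that N/A items are excluded from the denominator.

# Illustrative sketch, not the thesis's actual scoring procedure.
# Assumption: "yes" counts as correct, "no" as incorrect, and "n/a"
# items are dropped from the denominator before the percentage is computed.

def fidelity_percentage(marks):
    """Percentage of applicable checklist steps marked 'yes'."""
    applicable = [m.lower() for m in marks if m.lower() != "n/a"]
    if not applicable:
        return None  # nothing applicable to score in this phase
    correct = sum(1 for m in applicable if m == "yes")
    return 100.0 * correct / len(applicable)

# Hypothetical control-condition marks for one observation
control_marks = ["yes", "yes", "yes", "no", "yes", "n/a"]
print(fidelity_percentage(control_marks))  # 80.0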
Figure 7: Assessment Questionnaire. This figure illustrates the first page of the social validity
measure provided to each participant.
Assessment Questionnaire
Participant: _______________________________________ Date of Survey: ________________
Background Information
Are you a Board Certified Behavior Analyst? ______________________
How long have you been credentialed as a Board Certified Behavior Analyst? _________________________
How many years have you provided applied behavior analytic services? __________________________
How often, in a given year, do you conduct some form of functional behavior assessment? __________________
Assessment Questionnaire
Rate the following functional behavior assessments according to their perceived usefulness within your practice.
Assessment (0 = Unsure; 1 = Not Useful to 5 = Very Useful)
Trial-Based Functional Analysis 0 1 2 3 4 5
Functional Analysis Screening Tool (FAST) 0 1 2 3 4 5
Behavioral Observation (A-B-C recording) 0 1 2 3 4 5
Experimental Functional Analysis (Iwata et al., 1982) 0 1 2 3 4 5
Behavioral Assessment System of Children (BASC) 0 1 2 3 4 5
Scatter Plot 0 1 2 3 4 5
Interview Informed Synthesized Contingency Analysis 0 1 2 3 4 5
Functional Assessment Interview Form (O’Neill et al., 1997) 0 1 2 3 4 5
Rate the following functional behavior assessments according to how often you use them within your practice.
Assessment (0 = Unsure; 1 = Not Often to 5 = Very Often)
Trial-Based Functional Analysis 0 1 2 3 4 5
Functional Analysis Screening Tool (FAST) 0 1 2 3 4 5
Behavioral Observation (A-B-C recording) 0 1 2 3 4 5
Experimental Functional Analysis (Iwata et al., 1982) 0 1 2 3 4 5
Behavioral Assessment System of Children (BASC) 0 1 2 3 4 5
Scatter Plot 0 1 2 3 4 5
Interview Informed Synthesized Contingency Analysis 0 1 2 3 4 5
Functional Assessment Interview Form (O’Neill et al., 1997) 0 1 2 3 4 5
Figure 8: Assessment Questionnaire. This figure illustrates the second page of the social validity
measure provided to each participant.
Rate the following functional behavior assessments according to how practical they are to implement within your practice.
Assessment (0 = Unsure; 1 = Not Practical to 5 = Very Practical)
Trial-Based Functional Analysis 0 1 2 3 4 5
Functional Analysis Screening Tool (FAST) 0 1 2 3 4 5
Behavioral Observation (A-B-C recording) 0 1 2 3 4 5
Experimental Functional Analysis (Iwata et al., 1982) 0 1 2 3 4 5
Behavioral Assessment System of Children (BASC) 0 1 2 3 4 5
Scatter Plot 0 1 2 3 4 5
Interview Informed Synthesized Contingency Analysis 0 1 2 3 4 5
Functional Assessment Interview Form (O’Neill et al., 1997) 0 1 2 3 4 5
Rate the following functional behavior assessments according to your perceived accuracy of the assessment’s outcome.
Assessment (0 = Unsure; 1 = Not Accurate to 5 = Very Accurate)
Trial-Based Functional Analysis 0 1 2 3 4 5
Functional Analysis Screening Tool (FAST) 0 1 2 3 4 5
Behavioral Observation (A-B-C recording) 0 1 2 3 4 5
Experimental Functional Analysis (Iwata et al., 1982) 0 1 2 3 4 5
Behavioral Assessment System of Children (BASC) 0 1 2 3 4 5
Scatter Plot 0 1 2 3 4 5
Interview Informed Synthesized Contingency Analysis 0 1 2 3 4 5
Functional Assessment Interview Form (O’Neill et al., 1997) 0 1 2 3 4 5
Rate the following functional behavior assessments according to your familiarity with their methodology.
Assessment (0 = Unsure; 1 = Not Familiar to 5 = Very Familiar)
Trial-Based Functional Analysis 0 1 2 3 4 5
Functional Analysis Screening Tool (FAST) 0 1 2 3 4 5
Behavioral Observation (A-B-C recording) 0 1 2 3 4 5
Experimental Functional Analysis (Iwata et al., 1982) 0 1 2 3 4 5
Behavioral Assessment System of Children (BASC) 0 1 2 3 4 5
Scatter Plot 0 1 2 3 4 5
Interview Informed Synthesized Contingency Analysis 0 1 2 3 4 5
Functional Assessment Interview Form (O’Neill et al., 1997) 0 1 2 3 4 5
Figure 9: Concurrent multiple-probe graph across participants. This figure illustrates each
participant’s percentage of correct implementation of each component of the IISCA.
[Graph: concurrent multiple-probe design with one panel per participant (Finley, Jordyn, Baylor). Y-axis: Percentage of Correct Responding (0-100); x-axis: Sessions (0-25). Phases: Baseline, Rehearsal, Maintenance (probes labeled Week 4 for Finley and Week 2 for Baylor), and Generalization. Data series: Interview, Observation, Planning, Control, Test, Analysis.]
Figure 10: IISCA for King and James. This figure illustrates each client’s rate of challenging
behavior during the functional analysis.
[Graph: one panel per client (King, top; James, bottom). Y-axis: Challenging Behavior (Bx/min); x-axis: Session (1-6 for King, 1-4 for James). Data paths: Control and Test conditions.]
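Figure 10 reports challenging behavior as a rate (responses per minute), and the final checklist steps require the trainee to end the IISCA once the test and control conditions are differentiated. A minimal sketch of that arithmetic follows; the per-session counts and durations are hypothetical, and the differentiation rule shown (every test-session rate above every control-session rate) is a common visual-analysis heuristic rather than a criterion stated in this thesis.

# Illustrative sketch only; session counts, durations, and the
# differentiation rule are assumptions, not values from the study.

def rate_per_minute(count, duration_seconds):
    """Convert a session's response count to responses per minute."""
    return count / (duration_seconds / 60.0)

def differentiated(control_rates, test_rates):
    """True if every test-session rate exceeds every control-session rate."""
    return min(test_rates) > max(control_rates)

# Hypothetical 5-min sessions resembling the King panel
control = [rate_per_minute(c, 300) for c in (0, 0, 0)]
test = [rate_per_minute(c, 300) for c in (3, 4, 5)]
print(differentiated(control, test))  # True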
Figure 11: Change in Finley and Baylor’s social validity scores. This figure depicts the change in
two participants’ ratings of indirect and descriptive functional assessment procedures.
[Graph: one panel per participant (Finley, top; Baylor, bottom). Y-axis: Change in Score (-5 to 5); x-axis categories: Useful, Often, Practical, Accurate, Familiar. Legend: TBFA, FAST, Direct Observation, Analog Functional Assessment, BASC, Scatter Plot, IISCA, FAIF.]
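Each bar in Figure 11 is simply the participant’s post-training rating minus the corresponding pre-training rating on the 0-5 scales from the Assessment Questionnaire. A small sketch of that subtraction is shown below; the ratings used are hypothetical, not Finley’s or Baylor’s actual responses.

# Illustrative sketch; the ratings below are invented for the example.

def change_scores(pre, post):
    """Post-training rating minus pre-training rating for each assessment."""
    return {name: post[name] - pre[name] for name in pre}

pre = {"IISCA": 2, "FAST": 4, "Scatter Plot": 3}
post = {"IISCA": 5, "FAST": 3, "Scatter Plot": 3}
print(change_scores(pre, post))  # {'IISCA': 3, 'FAST': -1, 'Scatter Plot': 0}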