Discussion

 Hanley (2012) discussed a number of obstacles to conducting functional analyses of problem behaviors. Select 3 obstacles most relevant to you and discuss how to overcome them. This can either be based on experience or interest – but you must refer to the academic literature to support the discussion. 

REQUIREMENTS: 

 See attached. Focus on the discussion post rubric document. Follow APA 7th edition, with support from at least 5 academic sources, which must be journal articles or books published from 2019 to the present. NO WEBSITES are allowed as reference entries. Include DOIs, page numbers, etc. Plagiarism must be less than 10%. Also focus on Chapter 27 in Cooper, Heron, and Heward (2007) and the Hanley (2012) article.

Discussion Post Rubric
(20) Possible Points

Length of Post
  4 Points: The author's post consisted of 150 – 200 words.
  2 Points: The author's post consisted of 100 – 150 words.
  0 Points: The author's post consisted of 100 words or less.

Grammar, usage, spelling
  4 Points: The author's post contained fewer than 2 grammar, usage, or spelling errors.
  2 Points: The author's post contained 3 – 4 grammar, usage, or spelling errors.
  0 Points: The author's post contained over 5 grammar, usage, or spelling errors, and proofreading was not apparent.

Referencing and utilizing outside sources
  4 Points: The author posted references in APA format and cited one or more original references outside of the assigned readings.
  2 Points: The author posted references in APA format for the assigned readings but did not include an additional reference.
  0 Points: The author neither used APA format for referenced material nor cited an outside reference.

Promotes Discussion
  4 Points: The author's post clearly responds to the assignment prompt, develops ideas cogently, organizes them logically, and supports them through empirical writing. The author's post also raises questions or stimulates discussion.
  2 Points: The author's post responds to the assignment prompt but relies heavily on definitional explanations and does not create and develop original ideas and support them logically. The author's post may stimulate some discussion.
  0 Points: The author's post does not correspond with the assignment prompt, mainly discusses personal opinions or irrelevant information, or information is presented with limited logic and a lack of development and organization of ideas. Does not support any claims made.

Timely Response
  4 Points: Assignment is posted on or prior to the due date.
  2 Points: Assignment is one day late.
  0 Points: Assignment is two days late.

Be advised, there are also response costs associated with specific behaviors:

• A response cost of 3 points will be administered for not responding to a
peer's post.

• A response cost of 1 point will be administered for not reading all peers'
posts.

• Discussion posts that are turned in more than two days after the due date
will not be accepted unless otherwise excused by the instructor.

Functional Assessment of Problem Behavior: Dispelling Myths,
Overcoming Implementation Obstacles, and Developing New Lore
Gregory P. Hanley, Western New England University
Behavior Analysis in Practice, 5(1), 54-72

ABSTRACT

Hundreds of studies have shown the efficacy of treatments for problem behavior based on an understanding of its function. Assertions regarding the legitimacy of different types of functional assessment vary substantially across published articles, and best practices regarding the functional assessment process are sometimes difficult to cull from the empirical literature or from published discussions of the behavioral assessment process. A number of myths regarding the functional assessment process, which appear to be pervasive within different behavior-analytic research and practice communities, will be reviewed in the context of an attempt to develop new lore regarding the functional assessment process. Frequently described obstacles to implementing a critical aspect of the functional assessment process, the functional analysis, will be reviewed in the context of solutions for overcoming them. Finally, the aspects of the functional assessment process that should be exported to others versus those features that should remain the sole technological property of behavior analysts will be discussed.

Keywords: autism, descriptive assessment, functional analysis, functional assessment, indirect assessment, intellectual disabilities, open-ended interviews, problem behavior

After a conversation with Timothy Vollmer, one of my graduate school professors at the time, in which we discussed the subtle differences in the manner in which we had learned to conduct functional assessments of severe problem behavior, we concluded that a paper describing functional assessment "lab lore" would be important and well received by those who routinely conducted functional assessments. By "lab lore" we were referring to the commitments people had to the various strategies and tactics involved in the process of figuring out why someone was engaging in severe problem behavior. My graduate school advisor, Brian Iwata, suggested that rather than focus on lore that I focus on detecting the different functional assessment commitments by reviewing the literature base that existed. These collective interactions eventually led to a review of functional analysis procedures being published several years later (Hanley, Iwata, & McCord, 2003).

The 277 articles aggregated in that review, along with the hundreds that have been published since 2000, are the primary reasons practitioners are able to conduct effective functional assessments of problem behavior. Much has been learned from the functional assessment research base. Nevertheless, best practices regarding the functional assessment process are sometimes difficult to cull from this massive empirical literature. I never forgot about the idea of contributing an article that attempted to answer questions that arose when one put down an empirical study and attempted to conduct a functional assessment. This article is an attempt to fill in the gaps that exist between how the functional assessment process is described in published research articles and book chapters and how it probably should be practiced, at least from my perspective.

This perspective piece is not merely a set of opinions however; it is a review of relevant existing literature synthesized with my own practice commitments. Some readers may disagree with particular assertions in this paper and lament that the assertion may not be followed by an empirical reference. I do include references when a satisfactory analysis has been conducted, but I admit that some of my assertions have developed through both experience conducting functional assessments and from my own conceptual interpretation of existing analyses.

There are still many important questions to be asked about the manner in which problem behavior is understood prior to treating it, and I look forward
to reading and hopefully conducting some of that research, but
practitioners cannot wait for this next generation of studies
to be conducted. They need to know what to do today when
given the opportunity to help a family or teacher address the
severe problem behavior of a person in their care. I hope that
this paper will help practitioners develop their own set of
commitments regarding the functional assessment process and
perhaps also stimulate some important future research if an
assertion occasions skepticism from those who have different
commitments.

Some Rationales for Conducting a Functional Assessment

What is a functional assessment of problem behavior?
Despite the availability of a variety of functional assessment
forms, you can’t hold it in your hand—it is a process that
involves a lot of highly discriminated, professional behavior.
More precisely, it is a process by which the variables influenc-
ing problem behavior are identified. Why engage the process?
Because it allows you to identify an effective treatment for
severe problem behavior.

Behavior modification has been effectively used for many
years to address problem behavior, especially of those with
autism or intellectual disabilities (e.g., Hall et al., 1972; Risley,
1968). So you may be thinking, why conduct a functional
assessment of problem behavior? In other words, assigning
powerful but arbitrary reinforcers for not engaging in problem
behavior or for behavior incompatible with problem behavior
and assigning powerful punishers to problem behavior (i.e.,
modifying behavior) can effectively treat problem behavior, so
why bother conducting a functional assessment at all? There
are practical reasons; doing so increases treatment precision
and efficacy. In other words, doing so identifies treatments that
work and that can be practically implemented (as illustrated in
Carr & Durand, 1985; Iwata, Pace, Cowdery, & Miltenberger,
1994; Meyer, 1999; Newcomer & Lewis, 2004; Taylor &
Miller, 1997). There is an equally important humanistic reason
for doing so; conducting a functional assessment dignifies the
treatment development process by essentially “asking” the
person why he or she is engaging in problem behavior prior
to developing a treatment. Behavior modification, or program-
ming powerful but arbitrary reinforcers and punishers without
first recognizing the unique history of the person being served
or the prevailing contingencies he or she is experiencing, is
somewhat inconsiderate. It is like saying, “I don’t know why
you have been behaving in that extraordinary manner, but it
does not matter because I can change your behavior. . .” By
contrast, a behavior analytic approach, with functional assess-
ment at its core, essentially communicates: “I don’t know why
you have been behaving in that extraordinary manner, but I
will take some time to find out why and incorporate those fac-
tors into all attempts to change your behavior.”

To drive this point home, let’s do some perspective tak-
ing. Imagine that you experienced some temporary muscle
paralysis that does not allow you to talk, write, or engage in
controlled motor movements. You are now hospitalized and
on several medications that have the common side effect of
drying out your eyes, nose, skin, and, especially your mouth.
Water is viewable on the rolling table, but unattainable due to
your lack of dexterity. You learn that if you bang the bed rails
with the back of your hands long enough and loud enough,
people will come to you and do things for you, like turning the
television on or off or fluffing your pillows, or give you things,
one of which is the water that you desperately need. Due to its
functionality, the banging continues to such an extent that the
backs of your hands are bruised and your care providers an-
noyed. The consulting behavior modifer shows up and recom-
mends a program of contingent restraint with Posey® mitts “to
ensure your safety” and access to music and some Skittles when
you are not banging. Your problem behavior occurs much less
frequently. It doesn’t go away, but your bruises are healing, and
the staff is certainly less annoyed with you. Job well done by the
behavior modifer? I doubt you think so.

If there were a process available to allow your care provid-
ers to know the simple reason why you were hurting yourself
and annoying them, wouldn’t you want it employed? Wouldn’t
it have been nice to just be able to push a button that requested
assistance obtaining water at any given moment (or perhaps
simply have access to a long straw!)? The functional assessment
process makes these humane and practical outcomes pos-
sible. So let’s return to the earlier question of why conduct a
functional assessment and provide a better answer: Behavior
analysts should do it to identify effective, precise, personally
relevant, and humane treatments for problem behavior (see
Hanley, 2010 & 2011, for additional reasons for conducting
analyses).

Defining the Parts of the Process

Before I discuss some myths and isolate some good practices
regarding the functional assessment process, it is important to
define the three main types of functional assessment. With an
indirect assessment, there is no direct observation of behavior; in-
direct assessments take the form of rating scales, questionnaires,
and interviews (e.g., Durand & Crimmins, 1985; Paclawskyj,
Matson, Rush, Smalls, & Vollmer, 2000). With a descriptive
assessment, 1 there is direct observation of behavior, but without
any manipulation of the environmental conditions (Bijou,
Peterson, & Ault, 1968; Lalli, Browder, Mace, & Brown, 1993;
Lerman & Iwata, 1993; Mace & Lalli, 1991; Sasso et al., 1992;

1Because there is no manipulation of the environment when a descriptive
assessment is conducted, the term descriptive assessment, and not descriptive
analysis, is used here because as Baer, Wolf, and Risley (1968) noted, “a non-
experimental analysis is a contradiction in terms” (p. 92).


Vollmer, Borrero, Wright, Van Camp, & Lalli, 2001). This is
the “fy on the wall assessment,” which takes multiple forms like
A-B-C recording and narrative recording (Bijou et al.). With a
functional analysis, 2 there is direct observation of behavior and
manipulation of some environmental event (see Iwata, Dorsey,
Slifer, Bauman, & Richman, 1982/1994, for the seminal ex-
ample; see Hanley et al., 2003, for an expanded definition and a
review of these procedures). These three types are all functional
assessments; the term functional analysis is employed only when
some aspect of the environment is systematically altered while
problem behavior is being directly observed.

Reconsidering the General Approach
to Functional Assessment

The necessity or utility of a least restrictive hierarchical
approach to conducting functional assessment has not been
proven, although it is apparent in practice and described
(Mueller & Nkosi, 2006; O’Neill, Horner, Albin, & Storey,
1997) or implied (Iwata & Dozier, 2008; McComas & Mace,
2000) in book chapters or discussion articles regarding the
functional assessment of severe problem behavior. The myth
goes something like this: Start the functional assessment process
with an indirect assessment. If you are not confident in the results,
conduct a descriptive assessment. If you still have competing hy-
potheses regarding the variables controlling behavior, then conduct
a standard functional analysis. Like all things based on a least
effort hierarchy, this process has intuitive appeal, but there are
several reasons why behavior analysts should reconsider their
commitment to this assessment hierarchy. The first is that
closed-ended indirect assessments (e.g., Motivation Assessment
Scale [MAS], Questions About Behavior Function [QABF]) are
notoriously unreliable; when two people who have a history
with the person engaging in problem behavior are asked to
complete a rating scale, analyses of their responses usually
yield different behavioral functions (see Newton & Sturmey,
1991; Nicholson, Konstantinidi, & Furniss, 2006; Shogren &
Rojahn, 2003; Zarcone, Rodgers, Iwata, Rourke, & Dorsey,
1991 for some analysis of the reliability of closed-ended indirect
assessments; see Hanley, 2010, for a more in-depth discussion
of the reliability of these instruments). Without reliability,

2I prefer the term functional analysis to experimental analysis and to experimen-
tal functional analysis in both practice and in science in general because of
the very different effects "function" and "experimental" have on the listener.
Function can be understood in a mathematical sense, but more importantly,
it also conveys the operant or adaptive nature of the response being analyzed,
which has obvious importance in the context of behavioral assessment (see
Hanley et al., 2003; and Hineline & Groeling, 2010). The term experimental
does not convey this latter meaning, and instead erroneously conveys that
the procedures being implemented are in a sort of trial phase, awaiting a
proper analysis of their utility, as in an experimental medication. In addi-
tion, considering the quote from Baer et al. included in the footnote above,
experimental analysis is redundant.

there is no validity, meaning that there is no opportunity to
determine whether the function of behavior is correct from
these instruments. Closed-ended indirect assessments are likely
preferred because quantifiable results can be obtained quickly,
and documentation regarding behavior function is created and
can be easily filed or shared at an interdisciplinary meeting.
Behavior analysts can probably save a little time and be no
worse off by simply omitting closed-ended indirect assessments
from the functional assessment process.

At the start of the functional assessment process, behavior
analysts should indeed talk to the people who have most often
interacted with the person engaging in the problem behavior.
But, instead of presenting generic scenarios and asking for nu-
merical or yes/no answers (i.e., the substance of closed-ended
assessments), the behavior analyst should ask questions that
allow caregivers and teachers to describe in detail what happens
before and after severe problem behavior occurs. These sorts
of interviews are known as semistructured and open-ended
interviews. The appendix at the end of this article contains an
example of this sort of interview that allows behavior analysts
to discover common, as well as unique, variables that may
evoke or maintain problem behavior. Because of the likely
unreliability of interviews, including the one in the appendix,
treatments should typically not be designed based solely on the
results of these interviews; instead, functional analyses are to be
designed from the interview results. An open-ended interview
allows for behavior analysts to discover prevalent variables that
may be further examined and possibly demonstrated as impor-
tant via functional analyses. An important thing to consider is
that careful open-ended interviewing used to be the norm prior
to conducting functional analyses (see Iwata, Wong, Riordan,
Dorsey, & Lau, 1982).3

The second reason the least restrictive assessment hierarchy
is troublesome is due to its reliance on descriptive assessment
to determine behavioral function. I have yet to come across a
study showing that the exclusive results of a descriptive assess-
ment were useful for designing a treatment for severe problem
behavior. This is likely related to the fact that descriptive assess-
ments are notoriously invalid for detecting behavioral function
(St. Peter et al., 2005; Thompson & Iwata, 2007). Why might
this be? The fact that most people will attend to someone who
just kicked them or to someone who makes a jarring sound
when they bang their head on a wall leads to most descriptive
assessments suggesting that attention is a possible reinforcer
for severe problem behavior (McKerchar & Thompson, 2004;
Thompson & Iwata, 2001). But studies that have compiled

3There are multiple articles that describe conducting an open-ended interview
prior to conducting the functional analysis, but the interview appears to only
inform the topography of the behavior targeted in the analyses because the
analyses in these same studies are all standardized (i.e., including the same
test and omnibus control conditions).


data on the prevalence of behavioral function show that atten-
tion maintains problem behavior in only about one quarter to
one third of the cases examined (Derby et al., 1992; Hanley et
al., 2003; Iwata, Pace, Dorsey et al., 1994). The lack of cor-
respondence between descriptive assessments and functional
analyses is often due to these false-positive outcomes regarding
attention (see Thompson & Iwata, 2007).

Consider also that most teachers and parents learn to avoid
the presentation of events that evoke negatively reinforced
problem behavior (Carr, Taylor, & Robinson, 1991; Gunter
et al., 1994); perhaps this leads to the likely false negative out-
comes regarding behavior maintained by escape. For instance,

if the teacher has learned that difficult math evokes dangerous
behavior, the teacher is not likely to present difficult math
to the student while the behavior analyst is conducting the
descriptive assessment. Furthermore, it is unclear how auto-
matic reinforcement is to be detected and differentiated from
socially mediated problem behavior via descriptive assessments
(e.g., nonmediated sensory reinforcers cannot be detected and
recorded).

The literature has shown that descriptive assessments are
good at teaching us about the prevalence of the environmental
events occurring before and after problem behavior (McKerchar
& Thompson, 2004; Thompson & Iwata, 2001), but that we
need to conduct functional analyses to learn about the relevance
of those events for the severe problem behavior we are charged
with understanding. Therefore, behavior analysts can save a lot
of time and be no worse off by simply omitting formal, lengthy,
and especially closed-ended descriptive assessments from their
functional assessment process. Brief and open-ended observa-
tions may be useful for refining operational definitions of the
problem behavior or for detecting possible unique antecedent

or consequent events to examine in a functional analysis, and
they may be especially useful if the interview does not yield
unique information for designing the analysis.

The third reason the common hierarchy is troublesome is
due to its reliance on a standard functional analysis. By stan-
dard, I am referring to the rapid alternation of four conditions
in a multielement design with tests for all generic contingencies
(i.e., an attention test condition, an escape test condition, and
an alone condition testing for maintenance via automatic rein-
forcement) and an omnibus control condition usually referred
to as the play condition (Iwata, et al., 1982/1994). Simply put,
there is no standard analysis; a functional analysis of problem

behavior simply involves the direct observation
of behavior while some event suspected of being
related to problem behavior is manipulated. Note
that this widely agreed upon definition of a func-
tional analysis does not specify where the analysis
takes place (e.g., in a 3 m by 3 m therapy room or in
a busy classroom) or who will conduct the analysis.
More important is that it does not specify how
many test conditions to include or any particular
type of control condition (e.g., the omnibus play
condition is not mandatory). These are decisions
to be made based on the many factors that will
become evident during an open-ended interview.

For instance, if the results of the interview
show that one child’s loud moaning and hand
flapping occur under most conditions and seem to
occur irrespective of the social environment, con-
ducting a series of alone sessions first to see if the
problem behavior persists in the absence of social

consequences is a good idea. By contrast, if the results of the
interview show that another child’s tantrums most often occur
when the teacher removes toys from the child during free play,
then two conditions should be conducted, with the access to
the toys provided contingent on tantrums in one condition and
perhaps uninterrupted access to toys arranged in the second
condition. The former condition is known as the test condi-
tion because the contingency thought to maintain problem
behavior is present, whereas the latter condition is referred to
as the control condition because the contingency thought to
maintain problem behavior is absent.

The point being made with these examples is that behavior
analysts should consider asking simple questions about the
variables most likely influencing problem behavior and test-
ing the ones that seem to be most important first. By testing
one hunch at a time, more careful control conditions can be
designed in which only the contingency differs between test
and control conditions. The interested reader is directed to
Thompson and Iwata (2005) for a thorough discussion of the
importance of properly designing control conditions. If the
hunch from the interview or observation is affirmed in this
initial functional analysis, then the behavior analyst will have
a stable and sensitive baseline from which to assess the effects
of a function-based treatment. Examples of this approach in
which results of open-ended interviews informed the design
of analyses involving a single test condition and an intimately
matched control condition can be found in Hanley, Iwata, and
Thompson (2001).
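
To make the test/control distinction concrete, it can help to write the design down explicitly. The following minimal Python sketch is purely illustrative: the toy-removal scenario mirrors the example given above, but the dictionary fields and the comparison are my own assumed representation, not procedures taken from Hanley, Iwata, and Thompson (2001).

# A minimal sketch of an "intimately matched" test/control pair. Both
# conditions share the same establishing operation; only the contingency
# between problem behavior and the putative reinforcer differs.
test_condition = {
    "establishing_operation": "teacher removes toys during free play",
    "contingency": "access to toys returned contingent on each tantrum",
}
control_condition = {
    "establishing_operation": "teacher removes toys during free play",
    "contingency": "uninterrupted access to toys; tantrums produce no change",
}

# Confirm that the two conditions differ on the contingency and nothing else.
differing_fields = [k for k in test_condition if test_condition[k] != control_condition[k]]
print(differing_fields)  # ['contingency']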

More questions regarding other factors possibly influenc-
ing problem behavior can be asked separately and as often as
there are still questions about that which is infuencing prob-
lem behavior. In essence, there is no mandate that all questions
be asked in a single analysis (e.g., in the analysis format first
reported by Iwata et al., 1982/1994). It is equally important
to consider that there is no single analysis that can answer all
questions about the environmental determinants of problem
behavior. Even comprehensive analyses such as that initially
described by Iwata et al. (1982/1994) are incomplete in that
these analyses do not test all possible contingencies that may
influence problem behavior. The main strength of a functional-
analytic approach is that the analysis is flexible and can be
individualized. Although this set of assertions awaits empirical
validation, it seems likely that the probability of differentiated
analyses will be strongest when more precise and personalized
analyses are conducted based on the results of semistructured,
open-ended interviewing. I suggest the following for consid-
eration as practitioner lore regarding the general functional
assessment process: Start with a structured, but open-ended,
interview and a brief observation to discover potential factors that
may be influencing problem behavior, and then conduct a precise
and individualized functional analysis based on the resultant
information to examine the relevance of those discoveries.

Overcoming Common Obstacles to Conducting a Functional Analysis

The importance of the open-ended interview (e.g., Iwata et
al., 1982), especially for informing the design of the functional
analysis, seems to have been passively overlooked in behavior-
analytic practice, whereas the functional analysis (Iwata et al.,
1982/1994) appears to be more actively avoided in practice
(Desrochers, Hile, & Williams-Mosely, 1997; Ellingson,
Miltenberger, & Long, 1999; O’Neill & Johnson, 2000;
Weber, Killu, Derby, & Barretto, 2005). Behavior analysts
who are charged with treating severe problem behavior but
who do not conduct functional analyses are quick to provide
multiple reasons why they do not conduct analyses. These
reasons may have had merit in the past, but our research base
regarding functional analysis has grown tremendously (Hanley
et al., 2003; see JABA Special Issue on functional analysis, 2013,
volume 46, issue 1). With this growth, solutions for common
and seemingly insurmountable obstacles have been discovered,
properly vetted, and await adoption by those who would
benefit from an understanding of problem behavior prior to
its treatment—behavior analysts and the people they serve.
Tables 1 and 2 provide a summary of the available solutions
in the context of general and client-specific obstacles. Some
references for the empirically derived solutions for overcoming
the oft-stated obstacles to conducting functional analyses and
accompanying rationales follow.

[Figure. An example of graphically depicted data from a functional
analysis, plotting problem behaviors per min across six sessions. Note
the presence of only two conditions: one in which a contingency
thought to maintain problem behavior is present (test) and one in
which the contingency is absent (control).]

Implementation Obstacle 1: Functional Analyses Take
Too Much Time

Multiple researchers have proven the efficacy of several
timesaving methods relevant to functional analysis. Wallace and
Iwata (1999) showed that 5- and 10-min sessions are as valid
as longer sessions. Iwata, Duncan, Zarcone, Lerman, and Shore
(1994) showed us how to trim our designs to include only two
conditions. Considering only these adjustments, a functional
analysis can take as little as 30 min to complete (three 5-min test
sessions and three 5-min control sessions; see Figure). Sigafoos
and Saggers (1995), Wallace and Knights (2003), and Bloom,
Iwata, Fritz, Roscoe, and Carreau (2011) described trial-based
analyses in which test and matched control conditions occur for
a maximum of one-min each. Thomason-Sassi, Iwata, Neidert,
and Roscoe (2011) showed that sessions could be terminated
after a single response and that measurement of the latency
to the frst response can be sensitive to typical contingencies

PERSPECTIVES 58

arranged in a functional analysis. In short, functional analyses
need not require a lot of time.4
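
As a quick arithmetic check of the time estimates above, here is a minimal Python sketch. The 5- and 10-min session lengths and the six-session, two-condition design come from the studies cited in this paragraph; the helper function and the trial count used for the trial-based format are illustrative assumptions.

# Total in-session time for a few analysis formats (transitions ignored).
def total_minutes(num_sessions: int, minutes_per_session: float) -> float:
    return num_sessions * minutes_per_session

print(total_minutes(6, 5))    # three 5-min test + three 5-min control sessions = 30 min
print(total_minutes(6, 10))   # the same design with 10-min sessions = 60 min
print(total_minutes(20, 1))   # e.g., twenty 1-min trials in a trial-based format = 20 min (count assumed)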

It is also important to consider the chief alternative to a
functional analysis and that is to rely on a descriptive assess-
ment that often yields spurious correlations as opposed to the
more compelling functional relations derived from a functional
analysis. Descriptive assessments often take a long time to
complete because observers have to wait for problem behavior
to occur in uncontrolled environments in which the establish-
ing operation for the problem behavior may or may not be
presented (and because there is no obvious criterion for termi-
nating a descriptive assessment). In addition, considerable time
and expertise is required to collect a sufficient sample of data to
analyze and to undertake the increasingly complicated quanti-
tative analyses necessary to depict and begin to understand the
data yield via descriptive assessments (e.g., Emerson, Reeves,
Thompson, & Henderson, 1996). These efforts certainly take
more time than that required to conduct six brief observations
of problem behavior in properly informed test and control
conditions comprising an analysis.

Implementation Obstacle 2: Functional Analyses Are
Too Complex

The functional assessment and treatment development
process is complex, but functional analyses are less so, espe-
cially for anyone with training in behavior analysis. Iwata et
al. (2000) showed that undergraduates could accurately imple-
ment common analysis conditions after two hours of training.
Similar effects were shown by Moore et al. (2002). Hagopian
et al. (1997) provided a set of rules that aid in the accurate
visual analysis and interpretation of functional analysis data. In
short, implementing the procedures and interpreting the data
of functional analyses is possible with a little training. There are
no equivalent studies teaching people how to conduct a proper
descriptive assessment or how to analyze or effectively interpret
the data resulting from descriptive assessment as they relate to
detecting behavioral function.
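
For readers who want to see what interpreting the data amounts to in the simplest case, the sketch below compares hypothetical session rates from a two-condition analysis. The rates are invented, and the comparison rule is a crude illustrative stand-in, not the structured visual-analysis criteria of Hagopian et al. (1997).

# Hypothetical responses per minute across alternating sessions.
test_rates = [2.5, 3.1, 2.8]      # contingency for problem behavior present
control_rates = [0.3, 0.0, 0.2]   # contingency absent

mean_test = sum(test_rates) / len(test_rates)
mean_control = sum(control_rates) / len(control_rates)

# Crude stand-in for visual analysis: responding is consistently and clearly
# higher in test sessions than in control sessions.
differentiated = min(test_rates) > max(control_rates)
print(f"test mean = {mean_test:.2f}, control mean = {mean_control:.2f}, differentiated = {differentiated}")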

If you, as a behavior analyst, are still not confident you
can conduct functional analyses, consider the following logic.
Establishing a baseline of problem behavior from which to
determine whether a given treatment is effective is essential
in behavior-analytic practice (Behavior Analyst Certification
Board, 2012). Problem behavior must occur with some regu-
larity in baseline to detect the effects of treatment. Regularly
occurring problem behavior will only be observed if the
4I do not recommend any sort of brief functional analysis that involves con-
ducting only one of each test condition (e.g., Northup et al., 1991) because
necessary replication of test and control conditions is distinctly absent from
these analyses. I recommend the tactics described above because they retain
design features that allow for replication of suspected relations, the key ele-
ment for believing in conclusions regarding the function of behavior.

controlling contingency is present in that baseline (Worsdell,
Iwata, Conners, Kahng, & Thompson, 2000); if that is the
case, you essentially have created a functional analysis test
condition. By arranging a second condition in which the
controlling contingency for problem behavior is absent (i.e.,
the reinforcer is provided according to a time-based schedule
or for an alternative behavior, or withheld for all responding),
you essentially have created a functional analysis involving
a test condition and a control condition. In other words, if
you are capable of changing some aspect of the environment
and determining the effects of that single change on a direct
measurement of problem behavior, which is what all behavior
analysts are trained to do when evaluating a treatment, then
you can indeed conduct a functional analysis.

Implementation Obstacle 3: Functional Analyses Are Too Risky
for the Client or for the Person Conducting the Analysis

When considering risk, the main question to be asked is
will the child or client be at greater risk in the analysis than that
which they normally experience during the day? Put another
way, will their problem behavior be more dangerous or intense
in or outside of the analysis? This question is often best dis-
cussed with other professionals, especially medical profession-
als, if the problem behavior is self-injurious (see the description
of human subject protections from Iwata et al., 1982/1994).
Important information for such a discussion is that a properly
designed functional analysis will almost always result in problem
behavior that is of lower intensity than that observed outside
of the analytic context. This is the case because best practices
regarding functional analysis emphasize the inclusion of clearly
signaled contingencies, continuous reinforcement schedules,
and inclusion of problem behaviors in the contingency class
that are safe for the client to emit (Hanley et al., 2003). These
tactics typically result in more quickly discriminated problem
behavior and overall decreases in the intensity and often the
frequency of severe problem behavior in the analysis.

Risk is increased by certain tactics that may be adopted
when conducting an analysis, such as not programming dif-
ferential consequences in an analysis (Carr & Durand, 1985)
or arranging consequences in your functional analysis on
intermittent reinforcement schedules deduced from descriptive
assessments (Mace, 1994; Mace & Lalli, 1991). The problem
with both tactics is that higher rates and intensities of problem
behavior are almost guaranteed if you do not provide the puta-
tive reinforcer for each and every problem behavior in your
analysis.

Riskier alternatives to conducting an informed analysis,
as have been described thus far, are to extend assessment time
indefinitely by relying exclusively on descriptive assessment or
to design treatments based on ambiguous outcomes associated

with closed-ended indirect and descriptive assessments. Under
these conditions, delayed and ineffective treatments are likely,
resulting in the continuation of problem behavior, which is
perhaps the greatest risk of all.

Implementation Obstacle 4: Functional Analyses Are Difficult
to "Sell" to Constituents

Functional analyses of severe problem behavior probably do not make much sense to a parent or teacher the first time they are described to them. For instance, it must seem quite counterintuitive to allow someone to set up conditions that will seemingly worsen a child's self-injury. It is certainly more intuitive and more immediately agreeable to caregivers and teachers if we only ask questions about the person who is engaging in the problem behavior and/or watch the child in the classroom or at home to find out why the child is engaging in severe problem behavior. Conducting a functional analysis, which essentially requires reinforcement of problem behavior, is indeed counterintuitive and unexpected by our constituents; but, the process is not without precedent in our culture.

How should a behavior analyst proceed to obtain sufficient buy-in and necessary consent for this evidence-based process? First, the behavior analyst should build a therapeutic relationship with the parents and teachers. This relationship starts to develop during the open-ended interview and while making casual observations. Showing up, asking questions, and observing sends the important message that you need to learn a few things and that they have some answers. Once assessment partners (i.e., the parents and teachers) have a chance to talk and teach you about what the problem behavior is and the factors associated with it, you can then use analogies to help them understand what you are doing with a functional analysis. Like many people who routinely implement functional analyses, I have found the allergy test analogy especially helpful.

When you see a medical specialist for incapacitating allergies, she will first ask you a set of questions through which she is trying to narrow down the population of stimuli to which you might be allergic. The allergist will then use different needles to poke you with different possible allergens to see which ones will make your skin worsen a little bit (e.g., some redness and slight inflammation may occur on the site that was poked). Most allergists will also administer a control poke, just the needle with saline, no allergen, to be certain that the worsening is a function of the specific allergen. Really good allergists will provide multiple pokes of the same allergen and of saline or will repeat a smaller version of the test to ensure they got it right before they recommend a particular course of treatment. Sound familiar? This is what behavior analysts are essentially doing when they conduct a functional analysis. They are testing to see which environmental condition will give rise to a tolerable, slight, and short-lived worsening of problem behavior, and they will repeat the test until they are confident in the environmental conditions that are giving rise to the debilitating problem behavior. When the test is positive for some environmental event, we have a better understanding of the problem, which leads to more precise and practical treatment.

As noted initially, allergists usually interview people prior to poking them with needles, which underscores the importance of our asking questions prior to conducting a functional analysis, but allergists won't observe you for several hours (or more) recording the environmental correlates of allergic responses and base a treatment on those observed correlates (i.e., they don't do time-intensive descriptive assessments). Behavior analysts' time is just as important as that of any medical specialist.

In sum, to obtain proper buy-in from constituents of functional analysis: (a) build a therapeutic relationship during the interview process, (b) describe the practical and humane reasons for understanding the function prior to treating problem behavior, (c) describe how reinforcement-based treatments are more likely following a proper functional analysis (Pelios, Morren, Tesch, & Axelrod, 1999), (d) use analogies to explain why you are doing a functional analysis, (e) emulate the conditions they described in the interview as being important to problem behavior in your analysis so the connection between the two is apparent, and (f) adopt the previous suggestions for decreasing the assessment length and consider those articulated next for increasing the safety of the analysis.

Implementation Obstacle 5: Functional Analyses Can’t Be
Used for Dangerous Problem Behavior

The importance of creating analytic contexts that are safe
for both the child and the analyst is paramount, and doing
so is often seen as an insurmountable obstacle. This dilemma
seems to be responsible for many behavior analysts’ assertion
that they are willing to conduct analyses as long as the problem
behavior is not dangerous. Something to consider is that the
more dangerous the behavior, the more important it becomes
to accurately determine behavioral function so that a precise
and effective treatment can be prescribed as soon as possible.

The first thing to consider is the assessment context. Soft
toys should be included for children who are reported to break
or throw toys. Padded tables should be included for children
who are reported to bang the table with limbs or their head
during instructional periods. If aggression is being analyzed, the
analyst should wear protective equipment under their clothes
so that they can implement the differential contingencies with
fidelity while maintaining their own safety.

The analyst should remember next that proper scheduling
of putative reinforcers in the test and control conditions will
create safe environments for themselves and the person whose
behavior is being analyzed. More specifically, providing a
particular consequence for every instance of problem behavior
in the test condition, and doing so immediately following
each instance, will usually result in an immediate decrease in
the intensity of problem behavior. Arranging for the putative
reinforcer to be available for free and often or available for an
alternative response that has a decent probability of occurring
will increase the likelihood of a safe and manageable control
condition. Arranging extinction as the control condition is
likely to result in an unsafe condition because the continuation
of the establishing operation is likely to result in either a burst
of problem behavior or at least intermittent occurrences of the
problem behavior in the control condition.

The third set of considerations to address dangerous behav-
ior has already been described because they are the same tactics
available for decreasing the overall analysis duration. Consider
an analysis with only two conditions (test and control; e.g.,
Hanley et al., 2010) while using brief sessions (Wallace &
Iwata, 1999), trial-based (Bloom et al., 2011), or latency-based
analyses (Thomason-Sassi et al., 2011), all of which will shorten
the time in analysis and the number of responses allowed to
occur.

The fourth and perhaps most important consideration
pertains to the decision as to which behaviors will be scheduled
to receive the putative reinforcers. Very dangerous or intoler-
able behavior need not be the problem behavior reinforced in
the analysis. To accommodate the dangerousness of problem

behavior, Smith and Churchill (2002) demonstrated the efficacy
of conducting functional analyses of precursor behaviors that
were reported to reliably precede dangerous problem behavior
to identify its function. Precursors are behaviors that we can
tolerate more (e.g., pushing materials away) and that reliably
precede or cluster with the more dangerous or less tolerable
problem behavior (e.g., face punching or directed spitting,
see Fahmie & Iwata, 2011). In essence, both the precursor
and more dangerous behavior are measured, but the putative
reinforcers are only provided following the precursor behavior
in the test condition of the functional analysis. If a difference
between test and control conditions is observed, a small infer-
ential leap is made by concluding that the variable maintaining
the precursor behavior must also be maintaining the more
dangerous behavior. Behavior analysts can identify precursors
in an open-ended interview by asking what the child usually
does before she is aggressive or what other behaviors occur
during aggression (see Herscovitch, Roscoe, Libby, Bourret, &
Ahearn, 2009). Caregivers may not be able to routinely identify
reinforcers for problem behavior accurately—most people who
are not trained in behavior analysis do not see behavior through
the lens of a contingency—but they are adept at reporting pat-
terns and sequences of behavior (see for examples, Smith &
Churchill; Herscovitch et al.). When discovered, this informa-
tion can assist the behavior analyst in designing an efficient and
safe analysis of dangerous behavior.

Implementation Obstacle 6: Functional Analyses Can’t
Address Low-Rate Problem Behavior

A conceptually systematic interpretation of low-rate prob-
lem behavior is that the environmental events that establish the
value of the reinforcer for problem behavior are insufficiently
present. Because putative establishing operations are repeatedly
arranged in functional analyses, differentiated analyses can be
obtained even for reportedly low-rate problem behavior. When
the strong contingencies in functional analyses fail to evoke
problem behavior in the analysis (this will look like near-zero
responding in both test and control conditions), functional
analysis session lengths can be extended or the timing as to
when to conduct the analyses can be optimized to detect be-
havioral function. As an example of the former, Kahng,
Abt, and Schonbachler (2001) observed no aggression by an
adolescent with intellectual disabilities in an initial functional
analysis based on 10-min sessions. Analysis conditions were
then extended such that a single condition was implemented for
about 8 hours each day. An attention function of this low-rate
problem behavior was detected during the extended analysis,
and an effective function-based treatment was designed. As an
example of the latter, Tarbox, Wallace, Tarbox, Landaburu, and
Williams (2004) identified the function of low-rate problem
behavior by initiating functional analysis sessions whenever
problem behavior was observed to occur. They also showed that
treatments designed from their “opportunity-based” analysis
were effective.

Low-rate problem behavior may also be a function of not
including the relevant establishing operation or type of rein-
forcer. Open-ended interviewing or observations have proven
useful for identifying idiosyncratic aspects of contingencies in-
fluencing problem behavior. For instance, through open-ended
assessment, Fisher, Adelinis, Thompson, Worsdell, and Zarcone
(1998) discovered that instructions evoked problem behavior
only when issued during highly preferred activities such as
watching game shows or engaging in gross motor activities.
Functional analyses then demonstrated the influence of these
complex contingencies (for more examples involving the effec-
tive and necessary use of open-ended assessment as a response
to low-rate problem behavior and undifferentiated analyses, see
Bowman, Fisher, Thompson, & Piazza, 1997; Fisher, Lindauer,
Alterson, & Thompson, 1998; Thompson, Fisher, Piazza, &
Kuhn, 1998; Tiger, Hanley, & Bessette, 2006).

Implementation Obstacle 7: Functional Analyses Can’t
Address Covert Problem Behavior

By covert problem behavior, I am not referring to a situa-
tion in which someone is thinking about engaging in problem
behavior; I am referring to conditions in which the problem
behavior rarely or never occurs in the presence of others. When
confronted with covert behavior of this sort, it would seem
impossible to conduct a functional analysis because an analyst
would never have the opportunity to provide or withhold the
putative reinforcers in test and control conditions. Nevertheless,
examples of functional analysis applied to covert behavior exist.
For instance, while trying to understand why a young man with
developmental disabilities would engage in life-threatening pill
ingestion, Chapman, Fisher, Piazza, and Kurtz (1993) baited
an empty room with pill bottles and provided different relevant
consequences for ingesting inert pills from the different colored
bottles (e.g., medical attention from the blue bottle, escape from
school from the red bottle). These authors found a negatively
reinforcing function of covert pill ingestion and their informed
treatment reduced pill ingestion to near-zero levels.

Piazza, Hanley, and Fisher (1996) also used a baited room
strategy to detect the variables influencing the covert cigarette
pica of a young man with autism. Additional manipulations
of the content of the cigarettes revealed that the nicotine was
the likely automatic source of reinforcement for his problem
behavior; treatment based on this understanding was successful
in reducing this problem behavior.

Grace, Thompson, and Fisher (1996) were challenged with
understanding a young man’s low-rate, high-intensity self-injury

that resulted in torn eyelids and wounds requiring stitches and
that only occurred while no one was watching. To infer the
reinforcers for the covert self-injury, these authors designed an
analysis to detect the reinforcing value of different material and
social reinforcers (e.g., medical attention) for an arbitrary re-
sponse of stuffing envelopes. They found that adult interaction
was a reinforcer. Covert self-injury was eliminated when high-
quality attention was provided for the absence of the products
(e.g., wounds) of the young man’s self-injury. Their analysis was
similar to a functional analysis in that the reinforcers analyzed
were those that were thought to maintain the covert self-injury;
their analysis was unique in that reinforcement sensitivity was
assessed on responses that were not problematic. As in the case
of precursor analyses (Smith & Churchill, 2002), a small infer-
ential leap is required to determine behavior function with this
sort of reinforcer analysis. Nevertheless, these studies show that
obstacles based on response topography are surmountable.

Implementation Obstacle 8: Functional Analyses Can’t Address
Multiple Topographies or Functions of Problem Behavior

It is probably true that the odds of an undifferentiated
analysis are likely to increase as the number of topographically
distinct members that are available to receive the putative rein-
forcer increase in an analysis. Restricting the class of behaviors
that are reinforced in the analysis may be good practice (Hanley
et al., 2003), but it does imply that multiple distinct analyses
are required if the goal is to determine the function of multiple
topographies of problem behavior. If you do include several
topographies in the contingency class, Magee and Ellis (2000)
showed how the systematic arrangement of extinction for ad-
ditional topographies could provide information as to which
ones are maintained by the same reinforcer.

Furthermore, if a behavior analyst suspects that the same
topography of problem behavior is sensitive to multiple rein-
forcers, confident determinations of function can be made by
arranging different test and control comparisons in sequence
or by applying the tactic of affirming the consequent (Sidman,
1960) as was done by Smith, Iwata, Vollmer, and Zarcone
(1993). These authors arranged for various function-based
treatments following high levels of responding in different test
conditions as a means of affirming whether or not different
reinforcers influenced problem behavior.

Implementation Obstacle 9: Functional Analyses Can’t Address
Problem Behavior Influenced by Constantly Changing Reinforcers

For some children, and perhaps especially those with
diagnoses of autism, it seems that the reinforcers for severe
problem behavior are continually changing. The static nature
of the functional analysis test condition, in which a single
reinforcer type is established and delivered following problem

behavior, seems ill-suited to understand the determinants of problem behavior for these children. Bowman et al. (1997) described an analysis method for these situations with two boys with pervasive developmental disorder. Open-ended assessment suggested the children engaged in severe problem behavior when the parent did not comply with their requests, and the requests made by these boys were varied, frequent, and sometimes extraordinary. Problem behavior was observed at high rates in a test condition when the therapist complied with the child's requests only following severe problem behavior; it was observed at low rates when the therapist complied with all requests immediately. This analysis capitalized on the fact that the various events that were momentarily reinforcing and whose absence would evoke severe problem behavior were specified by each child. Successful function-based treatments can be designed for these children by teaching them which type of requests will be reinforced and when their requests will be reinforced and by not reinforcing requests following problem behavior.

Table 1. Tactics to Overcome General Obstacles to Conducting Functional Analyses

To address concerns regarding . . . the time required to conduct an analysis, consider:
• scheduling brief (5-min) sessions
• conducting an analysis informed by an open-ended interview consisting of only a single test condition and intimately matched control condition
• implementing trial-based analyses
• implementing latency-based analyses

To address concerns regarding . . . the complexity of an analysis, consider:
• conducting an analysis informed by an open-ended interview consisting of only a single test condition and intimately matched control condition

To address concerns regarding . . . the difficulty "selling" the analysis to constituents, consider:
• building a therapeutic relationship with parents and teachers via open-ended interviewing
• describing the practical and humane reasons for understanding function prior to treating problem behavior
• describing how reinforcement-based treatments are more likely following a proper functional analysis
• using analogies to explain the logic and acceptable risks inherent in a properly designed functional analysis
• emulating the conditions they described as being important to problem behavior in your analysis
• adopting the tactics for decreasing the assessment length and for increasing the safety of the analysis

To address concerns regarding . . . the danger to the client and person conducting the analysis, consider:
• conducting the analysis in an environment that allows for the problem behavior to occur safely
• including clearly signaled contingencies and continuous schedules of programmed consequences in test conditions
• scheduling brief (5-min) sessions
• conducting an analysis informed by an open-ended interview consisting of only a single test condition and intimately matched control condition
• implementing trial-based analyses
• implementing latency-based analyses
• arranging for putative reinforcers to only be provided for precursors to the dangerous behavior in the test condition

Avoid Undifferentiated Analyses by Incorporating
Their Solutions in the First Analysis

Undifferentiated analyses occur infrequently (less than
5%) in the published literature (Hanley et al., 2003), partly
because it is difficult to publish empirical data that provide no
information. Published examples of initially undifferentiated
analyses do exist, however (e.g., Bowman et al., 1997; Fisher
et al., 1998; Thompson et al., 1998; Tiger et al., 2006). A
recent analysis of data from a leading institution in functional
analysis research and practice showed that initial analyses are
undifferentiated about 50% of the time (Rooker, Hagopian,
DeLeon, & Jessel, in press). This analysis also illustrated the
iterative nature of the functional assessment process with 87%
of analyses differentiated by the second or third attempt, with
these attempts involving procedural adjustments to the analy-
sis. Adjustments made to the undifferentiated analyses in the
published literature cited above and from Rooker et al. clarified
the initially ambiguous results. These procedural modifications
can be classified according to which aspects of the assessed
contingency are altered.

One class of modifications involves making changes to
the type of the reinforcer manipulated across test and control
conditions (e.g., incorporating a more specific physical type of
attention or a unique tangible item). A second class involves
making changes to the events that are likely to establish the
value of the reinforcer for problem behavior (e.g., having the
adult engage in a conversation with another adult as opposed
to merely diverting their eyes to a magazine; having caregivers
serve as therapist [Ringdahl & Sellers, 2000]; changing the
type of instructions provided). A third class of modifications in-
volves adding or simplifying the events that signal the presence
or absence of the contingency influencing problem behavior
(adding condition-correlated stimuli [Conners et al., 2000], or
reducing the number of rapidly alternating conditions from 5
to 2). A fourth class of modifcations involves the introduction
of an entirely new and unique contingency (i.e., changing both
the EO and reinforcer) as in the provision of a requested event
following problem behavior in Bowman et al. (1997).

These changes are primarily directed toward the test condi-
tions; however, analyses may also be clarified by redesigning
the control condition. Using a noncontingent reinforcement
control condition as opposed to an extinction control condi-
tion, incorporating a denser schedule of noncontingent rein-
forcement, or perhaps omitting noncontingent reinforcers that
follow close in time to problem behavior may result in lower
levels of problem behavior in the control condition and hence
result in a differentiated analysis.
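
A minimal sketch of one such control-condition refinement follows, assuming a hypothetical 30-s noncontingent reinforcement schedule and a 5-s omission window; these values and the function name are illustrative assumptions only.

    # Illustrative sketch: deliver noncontingent reinforcement (NCR) on a dense
    # time schedule in the control condition, but withhold any delivery that
    # would closely follow problem behavior. All values are hypothetical.

    def ncr_delivery_times(session_seconds=300, interval=30, problem_times=(), omit_window=5):
        """Return scheduled NCR delivery times (s), omitting deliveries that fall
        within `omit_window` seconds after a recorded problem behavior."""
        deliveries = []
        for t in range(interval, session_seconds + 1, interval):
            if any(0 <= t - pb < omit_window for pb in problem_times):
                continue  # withhold so the delivery does not closely follow problem behavior
            deliveries.append(t)
        return deliveries

    # Hypothetical: problem behavior recorded at 58 s and 179 s of a 5-min control session
    print(ncr_delivery_times(problem_times=(58, 179)))  # deliveries at 60 s and 180 s are withheld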

All of the tactics described thus far may clarify an initially undifferentiated analysis, but a reasonable question to ask is: Why wait for an undifferentiated analysis to employ these tactics? Why not consider them with the initial design of a functional analysis? The point here is that we may not be analyzing severe problem behavior as efficiently as we could be when we standardize a powerful idiographic assessment such as the functional analysis. In fact, the flexibility of the functional analysis was evident in the Iwata et al. (1994) review, which included an escape-from-noise test condition. When first designing a functional analysis, the practitioner should consider the following tactics (an illustrative sketch of such a design appears after the list):

• Conduct a thorough open-ended interview and brief observation to discover ecologically valid and unique controlling variables, and allow this information to inform the design of the functional analysis.

• Alternate a single test condition, designed from the information obtained via interview and observation, and an intimately matched control condition, in which only the contingency between problem behavior and the putative reinforcer is removed.

• Select only topographically similar behavior that can be safely exhibited as the target of the analysis (i.e., limit the class of behaviors scheduled for the putative reinforcer).

• Assign salient discriminative stimuli to the test and control conditions.

• Schedule consequences to occur immediately following each target behavior (and withhold the same consequences for all other behaviors).
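
As forward-referenced above, the sketch below pulls these tactics into a single hypothetical session plan. The condition names, reinforcer, correlated stimuli, session length, and ordering are illustrative assumptions, not procedures prescribed by this article.

    # Illustrative sketch (hypothetical particulars) of an interview-informed
    # functional analysis plan: one test condition alternated with an intimately
    # matched control condition, 5-min sessions, salient condition-correlated
    # stimuli, and immediate (FR 1) consequences for target responses in test.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Condition:
        name: str
        correlated_stimulus: str     # salient discriminative stimulus for the condition
        establishing_operation: str  # how the value of the putative reinforcer is established
        contingency: str             # what follows each target response

    test = Condition(
        name="test: contingent attention",
        correlated_stimulus="yellow poster and yellow therapist shirt",
        establishing_operation="therapist diverts attention to a magazine",
        contingency="brief attention immediately after each target response (FR 1)",
    )

    control = Condition(
        name="matched control",
        correlated_stimulus="blue poster and blue therapist shirt",
        establishing_operation="none: attention provided continuously",
        contingency="no programmed consequence for target responses",
    )

    session_minutes = 5
    session_order: List[Condition] = [test, control] * 5  # rapid pairwise alternation

    for i, cond in enumerate(session_order, start=1):
        print(f"Session {i} ({session_minutes} min): {cond.name}")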


Table 2. Tactics to Overcome Client-Specific Obstacles to Conducting Functional Analyses

To address concerns regarding
function detection with Consider

. . . low-rate problem
behavior

• acknowledging that because putative establishing operations are repeatedly
arranged in functional analyses, differentiated analyses can be obtained even
for reportedly low rate behavior

• extending the durations of sessions and assessments

• conducting analyses only when problem behavior is occurring

• conducting additional open-ended interviews or observations to discover
idiosyncratic factors that may be included in analyses

. . . covert problem behavior

• conducting the analysis in a baited environment and in the absence of others

• conducting a reinforcer analysis in which the likely reinforcers for problem
behavior are available concurrently and/or for arbitrary responses of similar
effort

. . . multiple topographies
of problem behavior

• restricting the class of behaviors that are reinforced in the analysis

• systematically arranging for extinction of progressively more topographies

. . . possible multiple
functions of problem
behavior

• conducting multiple test and control comparisons in succession

• testing the independent effects of different treatments based on different
functions of problem behavior

. . . what appears to be
constantly changing reinforcers
for problem behavior

• relying on the child’s requests or current activity to identify the momentarily
valuable reinforcers and establish the value of those reinforcers by briefly
denying their access


Toward an Understanding of When to Consider Functional Analysis

All problem behavior certainly does not require a functional
assessment as described herein prior to developing a treatment.
When consulting in classrooms, it is probably best to ensure
that there are class-wide contingencies in place that promote
desirable behavior. When consulting in homes, it is important
to detect whether parents have a sound understanding of, and
good habits relevant to, differential reinforcement of desirable
behavior.

Practitioners should also consider consulting the function-
based treatment literature to extract important skills to be
developed for the children they serve. In
other words, all skills taught following effective
functional analyses should almost certainly be
assessed and taught to all children, especially
children diagnosed with autism or intellectual
disabilities, in order to address existing prob-
lem behavior or to prevent the development of
more severe forms of problem behavior. The
particular skills taught following the identification of reinforcers for problem behavior
include:

• Playing and other leisure skills for producing automatic reinforcers
• Complying with typical instructions
• Recruiting and maintaining the attention of others
• Escaping and avoiding unpleasant situations
• Gaining and maintaining preferred materials
• Tolerating delays, denials, and termination of preferred events
These skills should probably be discussed routinely by parents and interdisciplinary team members, and some variant
should remain on all individualized educational plans. These
are life skills. Those in our care should never be passed on
these general skills; the amount, complexity, and discriminated
nature of skills in each category should simply be refned over
time. The beginning of this sort of function-based prevention
curriculum can be found in Hanley, Heal, Ingvarsson, and
Tiger (2007).

Nevertheless, problem behavior may still persist under
these conditions. Behavior-analytic practitioners should first
determine the risk and cost for the child and their caregivers of
being prescribed an ineffective treatment. If the problem behav-
ior is dangerous or life threatening, the functional assessment
process should be initiated immediately and simultaneous with
the implementation of class-wide motivational systems and
basic parent training in contingency management.

An additional consideration is whether a stable and ecologically relevant baseline that will be sensitive to an effective treatment can be established in the absence of an analysis. One of the most useful features of an effective functional analysis is that the test condition can serve as a baseline from which the effects of any treatment can be assessed. This attribute of functional analysis seems especially important for behavior of a free-operant nature, those behaviors that can occur anytime and are not firmly anchored to any single environmental event (e.g., self-injurious behaviors such as hand biting or head hitting). Without an effective functional analysis, it is often difficult to establish a stable baseline of problem behavior. Naturalistic baselines of free-operant problem behavior obtained by collecting data throughout the day in a myriad of conditions tend to be highly variable and, therefore, less useful for detecting the
effects of treatment, especially in a rapid fashion. Functional
analysis seems especially important for free-operant behaviors
for which an effective baseline is difficult to establish.
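
To make the contrast concrete, the following sketch uses hypothetical session-by-session rates to show why a naturalistic, all-day baseline is typically more variable, and therefore less useful for quickly detecting treatment effects, than repeated functional analysis test sessions. The data and the variability measure are assumptions for illustration only.

    # Illustrative sketch with hypothetical data: a naturalistic, all-day baseline
    # of a free-operant behavior tends to be more variable than a functional
    # analysis test-condition baseline.

    from statistics import mean, pstdev

    def coefficient_of_variation(rates):
        """Relative variability of session-by-session rates (SD / mean)."""
        return pstdev(rates) / mean(rates)

    naturalistic_baseline = [0.1, 1.9, 0.4, 2.6, 0.2, 1.4]   # responses/min, varied daily contexts
    fa_test_baseline      = [1.6, 1.8, 1.5, 1.9, 1.7, 1.6]   # responses/min, repeated test sessions

    print(round(coefficient_of_variation(naturalistic_baseline), 2))  # high relative variability
    print(round(coefficient_of_variation(fa_test_baseline), 2))       # low relative variability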

There are several problem behaviors that, in contrast,
could be classified as restricted operants and that seem to
require little effort to establish stable and sensitive baselines.
These restricted operants are occasioned by highly specifc
environmental events. Some examples include noncompli-
ance, feeding-related problem behaviors, and sleep-interfering
behaviors. Stable baselines of these behaviors can very often
be established merely by presenting the precipitating event
(e.g., instructions, food, and the bid goodnight). In one sense,
the functional analysis is obviated because an effective and
ecologically-relevant baseline can be established in its absence.
It is under these conditions that conducting a functional analy-
sis is probably not necessary. Developing effective treatments
with only consideration of the probable positive and negative
reinforcers for these particular behaviors has been demonstrated
(e.g., Jin, Hanley, & Beaulieu, in press; Stephenson & Hanley,
2010; Valdimarsdóttir, Halldórsdóttir, & Sigurðardóttir,
2010). Omitting the functional analysis does not imply that
the controlling variables for the problem are not considered;


Table 3. Questions to Be Answered in Order to Develop an Individualized Functional Analysis

1. What problem behavior(s) will be targeted in the analysis?

2. What problem behaviors will be measured and how?

3. What are the safety precautions for the analysis? Has consent been obtained?

4. What reinforcers will be arranged in the test condition?

5. How will the value of the reinforcer be established?

6. How will the control condition be arranged?

7. What discriminative stimuli will be incorporated in test/control conditions?

8. What materials will be available in all conditions?

9. How long will sessions be? How long will the between-session time be and what will occur during that time?

10. Where will the analysis be conducted and by whom?

11. What session order will be used (what will the experimental design be)?

12. Who will graph and interpret the results?

13. Who will design and evaluate the function-based treatment?

14. Who will adjust the treatment so it is effective once extended to the school and home?

oftentimes a more thorough open-ended interview is required to determine the unique variables influencing these problem behaviors (see, for example, Jin et al., who recently showed the utility of a particular open-ended functional assessment for determining the variables influencing sleep-interfering behaviors among other sleep issues for young children).
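
A minimal sketch, using hypothetical trial data, of how a baseline for such a restricted operant might be recorded simply by presenting the precipitating event and noting whether (and how quickly) problem behavior follows; the class names, trials, and percentage measure are illustrative assumptions.

    # Illustrative sketch of a trial-based baseline for a restricted operant
    # (e.g., noncompliance): each trial presents the precipitating event and
    # records occurrence and latency of problem behavior. Data are hypothetical.

    from dataclasses import dataclass
    from typing import Optional, List

    @dataclass
    class Trial:
        precipitating_event: str          # e.g., an instruction, food presentation, the bid goodnight
        problem_behavior: bool
        latency_seconds: Optional[float]  # None if problem behavior did not occur

    def percent_of_trials_with_problem_behavior(trials: List[Trial]) -> float:
        return 100 * sum(t.problem_behavior for t in trials) / len(trials)

    baseline = [
        Trial("instruction: 'put the blocks away'", True, 4.0),
        Trial("instruction: 'come to the table'", True, 2.5),
        Trial("instruction: 'hand me the toy'", False, None),
        Trial("instruction: 'sit in your chair'", True, 6.0),
    ]

    print(percent_of_trials_with_problem_behavior(baseline))  # 75.0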

Exporting the Functional Assessment Process

Numerous articles imply that teachers in classrooms or
our allied professionals (e.g., social workers, speech and lan-
guage pathologists) should be expected to conduct functional
assessments following some training. Multiple studies describ-
ing ways to train teachers to conduct analyses provide some
evidence of this expectation (e.g., Ellingson, Miltenberger,
Stricker, Galensky, & Garlinghouse, 2000; Moore et al., 2002).
I would like to suggest that Board Certified Behavior Analysts® (BCBAs®) conduct functional assessments of severe problem
behavior with teachers, parents, and allied professionals as
partners in the process.

As described earlier, a proper functional assessment involves
both a semistructured open-ended interview (step 1) and a
safe and informed functional analysis (step 2), in which some event is manipulated to determine its effects on the probability of problem behavior. Open-ended interviewing and perhaps some open-ended observation allow for the discovery of factors influencing problem behavior. Functional analyses are often
necessary to demonstrate the relevance of those factors. Both
are essential to the functional assessment process (see excep-
tions in the above section).

If we break down these two steps, a proper functional assess-
ment of problem behavior then involves skills relevant to build-
ing relationships; clinical interviewing; direct measurement of
behavior; single-subject experimental designs; data graphing,
analysis, and interpretation; and reinforcement schedules and
behavioral processes. This is not something that can be or even
should be exported to teachers, social workers, speech patholo-
gists, or anyone else without the BCBA credential or other
solid evidence that they have competence with respect to all of
these skills (a list of questions to be skillfully considered prior
to an analysis is presented in Table 3). This skill set is precisely
that which should be learned in programs yielding behavior
analysis certification. What’s the current message in trying to
export the functional assessment process? It is that anyone can
do this thing called functional assessment and the necessary
component of the process called functional analysis. I just don’t
think that is the case. I could be taught to suture a wound, but


that would not make me a surgeon. People can be taught to say,
“don’t do that, you’re going to hurt yourself ” when a child hits
his head in an attention test condition of a functional analysis,
but that does not make them competent in the functional as-
sessment and treatment development process.

There is something I think we should export to teachers and
all of these allied professionals, however, and that is the funda-
mental assumption that problem behavior is learned, learned like
any other behavior. This assumption should be packaged with
our other assumptions relevant to problem behavior:

• that problem behavior serves a purpose for the child,
• that all problem behavior is a function of particular environmental conditions,
• that there are not aggressive kids per se but contexts that support aggression,
• that extraordinary behavior can develop and maintain under rather ordinary conditions,
• that the answers to how to help children with their problem behaviors can be found in understanding the effect their problem behavior is having on the environment.

These assumptions and their multiple framings are what
we need to export. Functional assessment for the masses should
be this heuristic, this thinking guide, to be applied every time
our allied professionals are engaged in conversations about how
to change another person’s behavior. It is vital that we teach
other professionals that if the problem behavior is persisting, it
is being reinforced. This is an assumption with great empirical
support that needs to be exported. When providing advice to a
school that has no BCBAs employed, behavior analysts should
help school personnel to develop systems that occasion people
thinking about and discussing the probable reinforcers for the
problem behavior before attempting to intervene. After these
conversations about reinforcers for the problem behavior oc-
cur, teaching school personnel how to expand the discussion
to acknowledge all aspects of a controlling contingency—the
reinforcers and the events that establish their value and signal
their availability—would be a good next step. Given the acu-
men required for proper functional assessment for problem
behavior, this is probably where our technical advice should
end. If treatments developed from these conversations do
not adequately address the problem behavior, assistance with
creating an effective employment advertisement for a full-time
BCBA should then be provided.

Having training in behavior analysis or being a BCBA
are the minimal requirements for conducting functional as-
sessments, but these histories may or may not be sufficient.
Behavior analysts who are responsible for treating severe prob-
lem behavior of children with autism should seek out academic
programs or internships that will provide them with the neces-
sary competency-based training to conduct safe and effective

functional assessments of severe problem behavior. Perhaps the
greatest legacy the field of applied behavior analysis can leave
the world is this concept that the most relevant determinants of
problem behavior are accessible, determinable, and capable of
being changed to improve the lives of all who exhibit problem
behavior.

References

Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current
dimensions of applied behavior analysis. Journal of Applied
Behavior Analysis, 1, 91–97.

Behavior Analyst Certification Board. (2012). Fourth edition task
list. Retrieved from http://www.bacb.com/Downloadfiles/
TaskList/BACB_Fourth_Edition_Task_List

Bijou, S. W., Peterson, R. F., & Ault, M. H. (1968). A method to
integrate descriptive and experimental field studies at the level
of data and empirical concepts. Journal of Applied Behavior
Analysis, 1, 175–191.

Bloom, S. E., Iwata, B. A., Fritz, J. N., Roscoe, E. M., & Carreau,
A. B. (2011). Classroom application of a trial-based functional
analysis. Journal of Applied Behavior Analysis, 44, 19–31.

Bowman, L. G., Fisher, W. W., Thompson, R. H., & Piazza,
C. C. (1997). On the relation of mands and the function of
destructive behavior. Journal of Applied Behavior Analysis, 30,
251–265.

Carr, E. G., & Durand, V. (1985). Reducing behavior problems
through functional communication training. Journal of Applied
Behavior Analysis, 18, 111–126.

Carr, E. G., Taylor, J. C., & Robinson, S. (1991). The effects of
severe behavior problems in children on the teaching behavior
of adults. Journal of Applied Behavior Analysis, 24, 523–535.

Chapman, S., Fisher, W., Piazza, C. C., & Kurtz, P. F. (1993).
Functional assessment and treatment of life-threatening drug
ingestion in a dually diagnosed youth. Journal of Applied
Behavior Analysis, 26, 255–256.

Conners, J., Iwata, B. A., Kahng, S., Hanley, G. P., Worsdell,
A. S., & Thompson, R. H. (2000). Differential responding
in the presence and absence of discriminative stimuli during
multielement functional analyses. Journal of Applied Behavior
Analysis, 33, 299–308.

Derby, K. M., Wacker, D. P., Sasso, G., Steege, M., Northup, J.,
Cigrand, K., & Asmus, J. (1992). Brief functional assessment
techniques to evaluate aberrant behavior in an outpatient
setting: A summary of 79 cases. Journal of Applied Behavior
Analysis, 25, 713–721.

Desrochers, M. N., Hile, M. G., & Williams-Mosely, T. L.
(1997). Survey of functional assessment procedures used with
individuals who display mental retardation and severe problem
behaviors. American Journal on Mental Retardation, 101, 535–
546.



Durand, V. M., & Crimmins, D. B. (1985). The Motivation
Assessment Scale: An administration manual. Unpublished
manuscript, University at Albany, State University of New
York, Albany, NY.

Ellingson, S. A., Miltenberger, R. G., & Long, E. S. (1999). A
survey of the use of functional assessment procedures in
agencies serving individuals with developmental disabilities.
Behavioral Interventions, 14, 187–198.

Ellingson, S. A., Miltenberger, R. G., Stricker, J., Galensky, T.
L., & Garlinghouse, M. (2000). Functional assessment and
intervention for challenging behaviors in the classroom by
general classroom teachers. Journal of Positive Behavioral
Interventions, 2, 85–97.

Emerson, E., Reeves, D., Thompson, S., & Henderson, D.
(1996). Time-based lag sequential analysis and the functional
assessment of challenging behaviour. Journal of Intellectual
Disability Research, 40, 260–274.

Fahmie, T. A., & Iwata, B. A. (2011). Topographical and functional
properties of precursors to severe problem behavior. Journal of
Applied Behavior Analysis, 44, 993–997.

Fisher, W. W., Adelinis, J. D., Thompson, R. H., Worsdell, A. S.,
& Zarcone, J. R. (1998). Functional analysis and treatment
of destructive behavior maintained by termination of “don’t”
(and symmetrical “do”) requests. Journal of Applied Behavior
Analysis, 31, 339–356.

Fisher, W. W., Lindauer, S. E., Alterson, C. J., & Thompson, R.
H. (1998). Assessment and treatment of destructive behavior
maintained by stereotypic object manipulation. Journal of
Applied Behavior Analysis, 31, 513–527.

Grace, N. C., Thompson, R., & Fisher, W. W. (1996). The
treatment of covert self-injury through contingencies on
response products. Journal of Applied Behavior Analysis, 29,
239–242.

Gunter, P. L., Denny, R. K., Shores, R. E., Reed, T. M., Jack,
S. L., & Nelson, M. (1994). Teacher escape, avoidance, and
counter-control behaviors: Potential responses to disruptive
and aggressive behaviors of students with severe behavior
disorders. Journal of Child and Family Studies, 3, 211–223.

Hagopian, L. P., Fisher, W. W., Thompson, R. H., Owen-
DeSchryver, J., Iwata, B. A., & Wacker, D. P. (1997). Toward
the development of structured criteria for interpretation of
functional analysis data. Journal of Applied Behavior Analysis,
30, 313–326.

Hall, V. R., Axelrod, S., Tyler, L., Grief, E., Jones, F. C., &
Robertson, R. (1972). Modification of behavior problems in
the home with a parent as observer and experimenter. Journal
of Applied Behavior Analysis, 5, 53–64.

Hanley, G. P. (2010). Prevention and treatment of severe problem
behavior. In E. Mayville & J. Mulick (Eds.), Behavioral
foundations of autism intervention (pp. 233–256). New York:
Sloman Publishing.

Hanley, G. P. (2011). Functional analysis. In J. Luiselli (Ed.),
Teaching and behavior support for children and adults with
autism spectrum disorder: A “how to” practitioner’s guide. New
York: Oxford University Press.

Hanley, G. P., Heal, N. A., Ingvarsson, E. T., & Tiger, J. H. (2007).
Evaluation of a class-wide teaching program for developing
preschool life skills. Journal of Applied Behavior Analysis, 40,
277–300.

Hanley, G. P., Iwata, B. A., & McCord, B. E. (2003). Functional
analysis of problem behavior: A review. Journal of Applied
Behavior Analysis, 36, 147–185.

Hanley, G. P., Iwata, B. A., & Thompson, R. H. (2001).
Reinforcement schedule thinning following treatment with
functional communication training. Journal of Applied Behavior
Analysis, 34, 17–38.

Herscovitch, B., Roscoe, E. M., Libby, M. E., Bourret, J. C., &
Ahearn, W. H. (2009). A procedure for identifying precursors
to problem behavior. Journal of Applied Behavior Analysis, 42,
697–702.

Hineline P. N., & Groeling, S. M. (2010). Behavior analytic
language and interventions for autism. In E. Mayville & J.
Mulick (Eds.), Behavioral foundations of autism intervention
(pp. 35–56). New York: Sloman Publishing.

Iwata, B. A., Dorsey, M. F., Slifer, K. J., Bauman, K. E., &
Richman, G. S. (1994). Toward a functional analysis of self-
injury. Journal of Applied Behavior Analysis, 27, 197–209.
(Reprinted from Analysis and Intervention in Developmental
Disabilities, 2, 3–20, 1982).

Iwata, B. A., & Dozier, C. L. (2008). Clinical application of
functional analysis methodology. Behavior Analysis in Practice,
1, 3–9.

Iwata, B. A., Duncan, B. A., Zarcone, J. R., Lerman, D. C., &
Shore, B. A. (1994). A sequential, test-control methodology
for conducting functional analyses of self-injurious behavior.
Behavior Modification, 18, 289–306.

Iwata, B. A., Pace, G. M., Cowdery, G. E., & Miltenberger,
R. G. (1994). What makes extinction work: An analysis of
procedural form and function. Journal of Applied Behavior
Analysis, 27, 131–144.

Iwata, B. A., Pace, G. M., Dorsey, M. F., Zarcone, J. R., Vollmer,
T. R., Smith, R. G., . . . Willis, K. D. (1994). The functions
of self-injurious behavior: An experimental-epidemiological
analysis. Journal of Applied Behavior Analysis, 27, 215–240.


Iwata, B. A., Wallace, M. D., Kahng, S., Lindberg, J. S., Roscoe, E.
M., Conners, J., Hanley, G. P., Thompson, R. H., & Worsdell,
A. S. (2000). Skill acquisition in the implementation of
functional analysis methodology. Journal of Applied Behavior
Analysis, 33, 181–194.

Iwata, B. A., Wong, S. E., Riordan, M. M., Dorsey, M. F., & Lau,
M. M. (1982). Assessment and training of clinical interviewing
skills: Analogue analysis and field replication. Journal of Applied
Behavior Analysis, 15, 191–203.

Jin, C. S., Hanley, G. P., & Beaulieu, L. (in press). An individualized
and comprehensive approach to treating sleep problems in
young children. Journal of Applied Behavior Analysis.

Kahng, S., Abt, K. A., & Schonbachler, H. E. (2001). Assessment
and treatment of low-rate high-intensity problem behavior.
Journal of Applied Behavior Analysis, 34, 225–228.

Lalli, J. S., Browder, D. M., Mace, F. C., & Brown, D. K.
(1993). Teacher use of descriptive analysis data to implement
interventions to decrease students’ problem behavior. Journal
of Applied Behavior Analysis, 26, 227–238.

Lerman, D. C., & Iwata, B. A. (1993). Descriptive and
experimental analysis of variables maintaining self-injurious
behavior. Journal of Applied Behavior Analysis, 26, 293–319.

Mace, F. C. (1994). The significance and future of functional
analysis methodologies. Journal of Applied Behavior Analysis,
27, 385–392.

Mace, F. C., & Lalli, J. S. (1991). Linking descriptive and
experimental analyses in the treatment of bizarre speech.
Journal of Applied Behavior Analysis, 24, 553–562.

Magee, S. K., & Ellis, J. K. (2000). Extinction effects during the
assessment of multiple problem behaviors. Journal of Applied
Behavior Analysis, 33, 313–316.

McComas, J. J., & Mace, F. C. (2000). Theory and practice in
conducting functional analyses. In E. S. Shapiro & T. R.
Kratochwill (Eds.), Behavioral assessment in schools (2nd ed.,
pp. 78–103). New York: Guilford Press.

McKerchar, P. M., & Thompson, R. H. (2004). A descriptive
analysis of potential reinforcement contingencies in the
preschool classroom. Journal of Applied Behavior Analysis, 37,
431–444.

Meyer, K. A. (1999). Functional analysis and treatment of problem
behavior exhibited by elementary school children. Journal of
Applied Behavior Analysis, 32, 229–232.

Moore, J. W., Edwards, R. P, Sterling-Turner, H. E., Riley, J.,
Dubard, M., & McGeorge, A. (2002). Teacher acquisition of
functional analysis methodology. Journal of Applied Behavior
Analysis, 35, 72–77.

Mueller, M. M., & Nkosi, A. (2006). Behavior analytic consultation
to schools. Atlanta GA: Stimulus Publications.

Newcomer, L. L., & Lewis, T. J. (2004). Functional behavioral
assessment: An investigation of assessment reliability and
effectiveness of function-based interventions. Journal of
Emotional and Behavioral Disorders, 12, 168–181.

Newton, J. T., & Sturmey, P. (1991). The Motivation Assessment
Scale: Inter-rater reliability and internal consistency in a British
sample. Journal of Mental Deficiency Research, 35, 472–474.

Nicholson, J., Konstantinidi, E., & Furniss, F. (2006). On some
psychometric properties of the questions about behavioral
function (QABF) scale. Research in Developmental Disabilities,
27, 337–352.

Northup, J., Wacker, D., Sasso, G., Steege, M., Cigrand, K.,
Cook, J., & DeRaad, A. (1991). A brief functional analysis
of aggressive and alternative behavior in an outclinic setting.
Journal of Applied Behavior Analysis, 24, 509–522.

O’Neill, R. E., Horner, R. H., Albin, R. W., Sprague, J. R., Storey,
K., & Newton, J. S. (1997). Functional assessment and program
development for problem behavior: A practical handbook (2nd
ed.). Pacific Grove, CA: Brooks/Cole.

O’Neill, R. E., & Johnson, J. W. (2000). Research and practice
for persons with severe disabilities. Journal of the Association for
Persons with Severe Handicaps, 25, 197–200.

Paclawskyj, T. R., Matson, J. L., Rush, K. S., Smalls, Y., & Vollmer,
T. R. (2000). Questions About Behavioral Function (QABF):
A behavioral checklist for functional assessment of aberrant
behavior. Research in Developmental Disabilities, 21, 223–229.

Pelios, L., Morren, J., Tesch, D., & Axelrod, S. (1999). The impact
of functional analysis methodology on treatment choice
for self-injurious and aggressive behavior. Journal of Applied
Behavior Analysis, 32, 185–195.

Piazza, C. C., Hanley, G. P., & Fisher, W. W. (1996). Functional
analysis and treatment of cigarette pica. Journal of Applied
Behavior Analysis, 29, 437–450.

Ringdahl, J. E., & Sellers, J. A. (2000). The effects of different
adults as therapists during functional analyses. Journal of
Applied Behavior Analysis, 33, 247–250.

Risley, T. R. (1968). The effects and side effects of punishing the
autistic behaviors of a deviant child. Journal of Applied Behavior
Analysis, 1, 21–34.

Rooker, G. W., Hagopian, L. P., DeLeon, I. G., & Jessel, J. (in
press). Clarification of undifferentiated functional analysis
outcomes. Journal of Applied Behavior Analysis.

Sasso, G. M., Reimers, T. M., Cooper, L. J., Wacker, D., Berg,
W., Steege, M., Kelly, L., & Allaire, A. (1992). The use of
descriptive and experimental analyses to identify the functional
properties of aberrant behavior in school settings. Journal of
Applied Behavior Analysis, 25, 809–821.


Shogren, K. A., & Rojahn, J. (2003). Convergent reliability and
validity of the Questions About Behavioral Function and the
Motivation Assessment Scale: A replication study. Journal of
Developmental and Physical Disabilities, 15, 367–375.

Sidman, M. (1960). Tactics of scientific research. New York: Basic
Books.

Sigafoos, J., & Saggers, E. (1995). A discrete-trial approach to the
functional analysis of aggressive behaviour in two boys with
autism. Australia & New Zealand Journal of Developmental
Disabilities, 20, 287–297.

Smith, R. G., & Churchill, R. M. (2002). Identification of
environmental determinants of behavior disorders through
functional analysis of precursor behaviors. Journal of Applied
Behavior Analysis, 35, 125–136.

Smith, R. G., Iwata, B. A., Vollmer, T. R., & Zarcone, J. R. (1993).
Experimental analysis and treatment of multiply controlled
self-injury. Journal of Applied Behavior Analysis, 26, 183–196.

St. Peter, C. C., Vollmer, T. R., Bourret, J. C., Borrero, C. S. W.,
Sloman, K. N., & Rapp, J. T. (2005). On the role of attention
in naturally occurring matching relations. Journal of Applied
Behavior Analysis, 38, 429–443.

Stephenson, K. M., & Hanley, G. P. (2010). Preschoolers’
compliance with simple instructions: A description and
experimental evaluation. Journal of Applied Behavior Analysis,
43, 229–247.

Tarbox, J., Wallace, M. D., Tarbox, R. S. F., Landaburu, H. J., &
Williams, L. W. (2004). Functional analysis and treatment of
low rate behavior in individuals with developmental disabilities.
Behavioral Interventions, 19, 187–204.

Taylor, J., & Miller, M. (1997). When timeout works some of the
time: The importance of treatment integrity and functional
assessment. School Psychology Quarterly, 12, 4–22.

Thomason-Sassi, J. L., Iwata, B. A., Neidert, P. L., & Roscoe, E.
M. (2011). Response latency as an index of response strength
during functional analyses of problem behavior. Journal of
Applied Behavior Analysis, 44, 51–67.

Thompson, R. H., Fisher, W. W., Piazza, C. C., & Kuhn, D. E.
(1998). The evaluation and treatment of aggression maintained
by attention and automatic reinforcement. Journal of Applied
Behavior Analysis, 31, 103–116.

Thompson, R. H., & Iwata, B. A. (2001). A descriptive assessment
of social consequences following problem behavior. Journal of
Applied Behavior Analysis, 34, 169–178.

Thompson, R. H., & Iwata, B. A. (2005). A review of reinforcement
control procedures. Journal of Applied Behavior Analysis, 38,
257–278.

Thompson, R. H., & Iwata, B. A. (2007). A comparison of
outcomes from descriptive and functional analyses of problem
behavior. Journal of Applied Behavior Analysis, 40, 333–338.

Tiger, J. H., Hanley, G. P., & Bessette, K. (2006). Incorporating
descriptive assessment results into the design of functional
analyses: A case example involving a preschooler’s
handmouthing. Education and Treatment of Children, 29,
107–124.

Valdimarsdóttir, H., Halldórsdóttir, L. Y., & Sigurðardóttir, Z. G.
(2010). Increasing the variety of foods consumed by a picky
eater: Generalization of effects across caregivers and settings.
Journal of Applied Behavior Analysis, 43, 101–105.

Vollmer, T. R., Borrero, J. C., Wright, C. S., Van Camp, C., &
Lalli, J. S. (2001). Identifying possible contingencies during
descriptive analyses of problem behavior. Journal of Applied
Behavior Analysis, 34, 269–287.

Wallace, M. D., & Iwata, B. A. (1999). Effects of session duration
on functional analysis outcomes. Journal of Applied Behavior
Analysis, 32, 175–183.

Wallace, M. D., & Knights, D. J. (2003). An evaluation of a brief
functional analysis format within a vocational setting. Journal
of Applied Behavior Analysis, 36, 125–128.

Weber, K. P., Killu, K., Derby, K. M., & Barretto, A. (2005). The
status of functional behavioral assessment (FBA): Adherence
to standard practice in FBA methodology. Psychology in the
Schools, 42, 737–744.

Worsdell, A. S., Iwata, B. A., Conners, J., Kahng, S., & Thompson,
R. H. (2000). Relative influences of establishing operations and
reinforcement contingencies on self-injurious behavior during
functional analyses. Journal of Applied Behavior Analysis, 33,
451–461.

Zarcone, J. R., Rodgers, T. A., Iwata, B. A., Rourke, D. A., &
Dorsey, M. F. (1991). Reliability analysis of the Motivation
Assessment Scale: A failure to replicate. Research in
Developmental Disabilities, 12, 349–362.

Author Note

I would like to thank the doctoral students who have
enrolled in Western New England University’s Behavioral
Assessment course over the past two years for their thoughtful
discussions that helped refine my own thinking about func-
tional analysis in practice and for their and Rachel Thompson’s
expert feedback on an earlier version of this manuscript. Author
correspondence can be directed to ghanley@wne.edu.

Action Editor: James Carr



Appendix

Open-Ended Functional Assessment Interview Date of Interview: _________________

Child/Client: __________________________ Respondent: _________________________

Respondent’s relation to child/client: ___________________ Interviewer: _________________________

RELEVANT BACKGROUND INFORMATION

1. His/her date of birth and current age: ____-_____-_________ ____yrs ____mos
Male/Female

2. Describe his/her language abilities.
3. Describe his/her play skills and preferred toys or leisure activities.
4. What else does he/she prefer?

QUESTIONS TO INFORM THE DESIGN OF A FUNCTIONAL ANALYSIS

To develop objective definitions of observable problem behaviors:
5. What are the problem behaviors? What do they look like?

To determine which problem behavior(s) will be targeted in the functional analysis:
6. What is the single-most concerning problem behavior?
7. What are the top 3 most concerning problem behaviors? Are there other behaviors of concern?

To determine the precautions required when conducting the functional analysis:
8. Describe the range of intensities of the problem behaviors and the extent to which he/she or others may be hurt or injured from the problem behavior.

To assist in identifying precursors to dangerous problem behaviors that may be targeted in the functional analysis
instead of more dangerous problem behaviors:
9. Do the different types of problem behavior tend to occur in bursts or clusters and/or does any type of problem behavior typically precede another type of problem behavior (e.g., yelling preceding hitting)?

To determine the antecedent conditions that may be incorporated into the functional analysis test conditions:
10. Under what conditions or situations are the problem behaviors most likely to occur?
11. Do the problem behaviors reliably occur during any particular activities?
12. What seems to trigger the problem behavior?
13. Does problem behavior occur when you break routines or interrupt activities? If so, describe.
14. Does the problem behavior occur when it appears that he/she won’t get his/her way? If so, describe the things that the child often attempts to control.

To determine the test condition(s) that should be conducted and the specific type(s) of consequences that may be
incorporated into the test condition(s):
15. How do you and others react or respond to the problem behavior?
16. What do you and others do to calm him/her down once he/she has engaged in the problem behavior?
17. What do you and others do to distract him/her from engaging in the problem behavior?

In addition to the above information, to assist in developing a hunch as to why problem behavior is occurring and to
assist in determining the test condition(s) to be conducted:
18. What do you think he/she is trying to communicate with his/her problem behavior, if anything?
19. Do you think this problem behavior is a form of self stimulation? If so, what gives you that impression?
20. Why do you think he/she is engaging in the problem behavior?

