Counseling Services
By
Rebecca Cole
School of Education
Department of Educational and Counseling Psychology
2012
UMI 3511677
Copyright 2012 by ProQuest LLC.
All rights reserved. This edition of the work is protected against
unauthorized copying under Title 17, United States Code.
ProQuest LLC.
789 East Eisenhower Parkway
P.O. Box 1346
Ann Arbor, MI 48106 - 1346
Acknowledgements
As I put the finishing touches on my dissertation and my graduate experience at
the University at Albany, I am filled with gratitude for all the people who have helped me
to get to this point in my career. I would like to begin by recognizing Dr. Deborah
Kundert, my dissertation chair. She has worn many hats over the years that we have
known each other, beginning as one of my professors, and then becoming my supervisor,
advisor, and mentor. Deb was a natural choice as a chairperson, as she has always
exuded the qualities of professionalism, attention to detail, and a strong work ethic that I
greatly admire and seek to develop in myself. Thank you, Deb, for helping me to be a
woman on a mission, and for devoting so much time, energy, and support to me over the
years. I would not have achieved my dissertation goals without you. Many thanks also
go to Dr. Kevin Quinn and Dr. Stacy Williams, my committee members, for carefully
reading and providing valuable feedback that helped to shape my dissertation. Stacy has
also played multiple roles in my development as a school psychologist, and I am
fortunate to be able to take so many lessons away from our experiences together. Thank
you, Stacy, for building me back up and for helping me to stay the course when things
got challenging.
Every endeavor I have pursued has been supported by my parents, Mary and Gary
Cole, who have always taught me that anything worth having is worth working hard for.
Thanks, Mom and Dad, for always being in my corner, for always picking up the phone
(no matter what time it was), for all those signatures, and for all your words and gestures
of support over the years. My sisters, Aleta, Colleen, and Rita, have always reminded me
that, despite the sacrifice, all the hard work would be worth it, and as I take time to
celebrate with them, it is clear that they were right. I have also been blessed with a large
extended family, who always asked, "How is school going?" at regular intervals, while
also wishing me luck and marveling at my achievements, before finally asking if they
would need to call me "doctor" after graduation. When I think of the invaluable support
my family has provided, I am also reminded of my grandmother, the late Ann Malloy,
who, as I was getting ready to return to Albany after visiting (always too soon), would
remind me that she loved me and that the problems would still be there on Monday. You
were right, Grandma, and I know you wish you were here to celebrate with me as much
as I wish you were.
Thanks also go out to my friends for always being there for me, and for never
holding a grudge over my inability to keep in touch, or when I cancelled plans at the last
minute to study, complete an assignment, or work on my dissertation. Special thanks go
out to Romann Weber, for playing the role of boyfriend, best friend, lifeline, coach, and
therapist. Your support has proved integral in getting me through my dissertation,
whether this involved well-placed humor, defusing statistics or formatting tantrums, or
just reassuring me that I knew what I was doing and that everything would work out. I
am indebted to you for your unwavering support and the time and energy you sacrificed
from your own work to help me get where I am. Thank you, and I love you very much.
Finally, I wish to acknowledge the myriad teachers, administrators, service providers,
paraprofessionals, families and students that I have worked with over the years as a
school psychologist and psychologist in training, and as a teacher. These experiences
have motivated me and taught me more than any class or lecture could have, and for this I
am truly grateful.
Abstract
These results suggest that school psychologists apply many of the steps of the problem-solving model in accordance with federal special education laws, especially when
defining target behaviors and planning interventions. These results, however, also call
into question the degree to which these school psychologists engage in progress
monitoring and data-based decision making. The quality and frequency of baseline and
progress monitoring data collection may not enable accurate comparison of student
behavior and demonstration that behavioral improvement has been made. Further
research is needed to determine barriers and facilitators to objective data gathering in
practice settings. Implications for school psychology training programs include identifying the practices to emphasize in order to help new and current practitioners shift from serving primarily as assessors to acting as researchers and active problem-solvers who consistently and effectively implement all aspects of the problem-solving model, consult with other school professionals to gather and evaluate behavioral data, and demonstrate accountability.
TABLE OF CONTENTS
Page
TITLE ................................................................................................................................. i
ACKNOWLEDGEMENTS ............................................................................................... ii
ABSTRACT ...................................................................................................................... iv
TABLE OF CONTENTS .................................................................................................. vi
LIST OF TABLES ............................................................................................................ ix
LIST OF APPENDICES ................................................................................................... xi
CHAPTER 1 INTRODUCTION .................................................................................... 1
Purpose of Study .................................................................................................... 6
Significance of Study ............................................................................................. 6
CHAPTER 2 REVIEW OF RELEVANT LITERATURE ............................................. 8
Overview ................................................................................................................ 8
School Psychology: History, Approaches, Training, and Current Status ............ 11
Growth and Development of the Field of School Psychology ................ 11
Refinements in the Provision of School Psychological Services............. 15
Problem-Solving Approaches .................................................................. 16
Training .................................................................................................... 18
Contemporary Roles and Functions ......................................................... 20
Section Summary ..................................................................................... 22
School Psychologists as Counselors .................................................................... 23
Definitions of Counseling and Intervention ............................................. 23
Rationale for School Psychologists and Counseling ............................... 25
Current Counseling Practices of School Psychologists ........................... 29
Section Summary ..................................................................................... 32
Current Best Practices in Counseling .................................................................. 33
Evidence-Based Practices/Evidence-Based Interventions ....................... 34
Factors Supporting the Exclusive Use of Evidence-Based Practices ...... 39
Current Evidence-Based Practices Related to Counseling ...................... 40
Section Summary ..................................................................................... 45
Designing and Evaluating Direct Interventions ................................................... 46
CHAPTER 1: Introduction
As the field of school psychology has grown over time, the duties of the school
psychologist have changed in response to the needs of students, mandates from state and
federal government, and research defining practices proven to be effective for meeting
the academic and behavioral needs of students (Fagan, 2008). The emergence and
growth of the field of school psychology occurred in tandem with the development of the
modern American public school. Efforts at special education in response to the growth
and diversity of the student body in American schools created the need for a variety of
new positions within the school, including attendance officers, guidance counselors,
school nurses, school psychologists, school social workers, and vocational counselors
(Fagan, 1992).
The work of early school psychologists developed as part of several movements,
particularly child study, the beginnings of psychology as a distinct field, and the
emergence of psychological and educational testing (Reisman, 1966; Wallin & Ferguson,
1967). Another sign of development was that departments of education began writing the mandatory provision of psychological services to explore deficits in student behavior into state laws and regulations (Hollingworth, 1932). Factors beyond the
school setting, such as population increases after World War II, and special education
laws requiring psychoeducational evaluations (Fagan, 2008), also helped promote growth
in the field, as school psychologists began to encounter larger student bodies with diverse
educational and behavioral needs. Current problem-solving approaches used to address
these needs look at the child's response to intervention, based on direct assessment of
observable behaviors, with less time devoted to understanding underlying traits or skills
(Fagan, 2008; Lichtenstein, 2008). At this time, however, the extent to which the
problem-solving approach informs the current practices of school psychologists is
unknown.
To ensure compliance with federal special education requirements, states passed
credentialing standards requiring that school psychologists complete certain courses and
gain specific experiences before working with children. These laws and credentialing
standards shaped training in school psychology. Over time, training experiences have
expanded to prepare school psychologists to conduct a variety of assessments and design
interventions based on their results, along with providing consultation and prevention
services in schools through a combination of classwork, practica, and internship
experiences (Fagan, 1986, 2008).
In order to meet the needs of students, the activities and responsibilities of the
school psychologist have also evolved. The primary and most enduring role of the school
psychologist has been assessment for the purposes of determining appropriate placement
and educational experiences. The second role entails designing and implementing direct
interventions for academic and behavioral issues, while the third involves applying
interventions on an indirect or systems-level basis as a consultant. Research on time
allotted to these different activities reveals that most school psychologists spend almost
half of their time on assessment, while devoting the remaining half to direct
interventions, and systems-level consultation and research (Bramlett, Murphy, Johnson,
Wallingsford, & Hall, 2002; Fisher, Jenkins, & Crumbley, 1986; Goldwasser, Meyers,
Chistenson, & Graden, 1983; Hartshorne & Johnson, 1985; Lacayo, Sherwood, & Morris,
1981; Meacham & Peckam, 1978; Reschly & Wilson, 1995; Smith, 1984).
Given the laws already in place, it would appear that accountability for student outcomes is one
theme that will continue to guide the activities of school professionals, including school
psychologists.
Purpose of Study
The purpose of the current study was to survey school psychologists about their
current counseling practices to determine whether they are implementing research and
best practice guidelines related to the use of evidence-based interventions, progress
monitoring, and data-based decision making. In addition, this study explored whether
specific aspects of counseling practices varied according to demographic characteristics.
Significance of Study
The notion of accountability is a common theme that is repeatedly mentioned not
only in the literature describing best practices in counseling and mental health, but also in
current and proposed federal education legislation (Wright & Wright, 2009). School
psychologists have expressed a desire to spend more time involved in counseling as a
direct intervention to meet the mental health and behavioral needs of their students
(Agresta, 2004). The use of evidence-based interventions, progress monitoring, and data-based decision making provides measures of accountability that can be applied to the design
and implementation of counseling as a direct intervention and may provide assurance to
school psychologists that their efforts are addressing the mental health and behavioral
needs of their students. In addition, although, at this time, accountability standards and
consequences apply to educational practices and outcomes, as well as managed care and
the private sector (Kazdin, 2008), in the future, accountability standards and consequences
may also govern school psychologists as they address mental health and behavioral
education programs (e.g., ESEA Federal Amendments [1974]), and how federal funding
should be distributed (e.g., the Educational Consolidation Improvement Act [1982; Gray,
Caulley, & Smith, 1982]). Issues related to funding and appropriate educational
opportunities are two themes that can also be traced throughout the evolution of
legislation focusing on students with educational exceptionalities. Once access to public
education had been secured, quality educational opportunities were defined initially as
appropriate placements and services (e.g., the Education of the Handicapped Act
Amendments [1986; House Committee on Education and Labor, 1986]), with later
specifications mandating educational experiences with non-disabled peers (e.g.,
Individuals With Disabilities Education Act [1990]; School of Public Health and Health
Professions, University at Buffalo, 2005), and an explicit connection between special
education and the general curriculum (e.g., Individuals With Disabilities Education Act;
National Center for Children and Youths with Disabilities [1998]).
Currently, legislation is focused on accountability for student outcomes. An early
example of accountability can be seen in the 1988 amendments to the ESEA, the
Hawkins-Stafford School Improvement Amendments (House Committee on Education
and Labor, 1988), as it made federal funding contingent on documented gains in student
achievement, with increased regulation for schools unable to demonstrate student
improvement. Since this time, standards defining adequate progress and corrective
actions when gains in achievement have not been made have become more specified, and
have had a significant impact on public education (Nelson & Weinbaum, 2009). A
contemporary example of accountability legislation is the No Child Left Behind Act
(NCLB; 2002), with provisions detailing appropriate training for teachers and
Table 1

Summary of Important People and Events Contributing to the Development of the Field of School Psychology

Event/Person: Social phenomena (compulsory attendance laws; immigration; rapid increase in the number of students attending American public schools; Grant & Eiden, 1980)
Implications: School staff are presented with challenges related to truancy, learning difficulties, and discipline (Irwin, 1915; Reisner, 1915). Schools began to offer medical and psychological examinations (Wallin, 1914), as well as special education classes (Fagan, 1992; Van Sickle, Witmer, & Ayers, 1911).

Event/Person: Lightner Witmer

Event/Person: Age norms for children (Hollingworth, 1932); tests suitable for group administration (Cutts, 1955)
Implications: Developments and refinements in the use of psychoeducational testing allowed school psychologists to classify students in an objective and standardized fashion.

Implications: Provided representation for school psychologists in matters of public policy; promoted the use of best practices through the dissemination of information; established nationally recognized training and certification standards (Benjamin & Baker, 2003).
the child study movement and in early psychological clinics. This knowledge was
applied to the creation of standardized psychoeducational tests, which provided an
objective way of measuring student abilities. Early psychological tests were then piloted
and improved as their use became mandated by state and federal laws. The remainder of
this section will provide greater depth related to changes in service provision, problem-solving approaches, training, and roles and functions, to describe how the field of school
psychology has developed and evolved.
Refinements in the provision of school psychological services. The discussions
held and consensus reached at the Thayer Conference had a profound impact on the field
of school psychology that helped to solidify the professional identity of the school
psychologist (French, 1984). Fagan (2008) provided an excellent description of the
evolution of the field of school psychology that serves to explain the forces behind the
growth in the number of school psychology practitioners and the refinement in service
ratios. Societal factors impacting the growth in the number of school psychology
practitioners included an increase in the population after World War II, and the passage
of special education laws requiring the provision of psychological services (Fagan, 2008).
Early school psychological services were provided to the schools through external agencies. By 1960, however, the school was the primary employment location for school
psychologists. In the 1980s, federal laws regarding evaluations for students with
disabilities, increases in insurance reimbursement for psychologists, and growth in the
number of school psychologists earning doctoral degrees contributed to variety in the
locations where school psychological services were to be provided (Fagan, 2008).
Contemporary practice settings include public and private schools, private practices, state
affecting the behavior and educational achievement of the students with whom they
worked. In addition to standardized and unstandardized measures of student intelligence
and academic achievement, practitioners now used formal and informal measures to
gather information on the student's peer relationships, feelings about school, relationship
with parents or guardians, instructional environment, and behavior with and without
peers (Fagan, 2008). Child-centered interventions enjoyed continued use, and were
refined by improvements in special education and therapies for groups and individuals,
informed by developments in child-centered, rational-emotive, and behavior modification
theories (Fagan, 2008). Special education placements at this time became less restrictive,
as mainstreaming and inclusion practices became popular. Interventions were now also
provided indirectly, as school psychologists consulted with teachers and parents (Porter
& Holzberg, 1978).
Although the ecological approach had a significant impact on the practice of
school psychology, more contemporary approaches focus on assessment and the design
of corresponding interventions based on directly observable academic skills and
behaviors, with less of a focus on underlying abilities or traits (Lichtenstein, 2008).
School psychologists still consider a range of ecological variables related to peers,
teachers, classroom environment, instruction, and parenting, but only as these relate to
the creation of outcome-based assessments and interventions (Batsche, Castillo, Dixon, &
Forde, 2008). Provisions in special education laws requiring functional behavioral
assessment and treatment accountability reinforced a focus on outcomes, along with
increased priority placed on the use of empirically supported interventions as articulated
by clinical, counseling, and school psychology. Current problem-solving approaches
now directly assess academic skills embedded in the school curriculum, with
interventions directly linked to skill development, rather than relying entirely on
normative assessment measures (Hosp, 2008). The movement towards outcome-based
and empirically valid practices is exemplified in the response to intervention (RTI)
model. Procedures used within the RTI model include pre-referral assessment and
screening, assessment, and intervention based directly on classroom curriculum,
criterion-referenced measurement, evidence-based practices, and problem-solving
consultation (Casbarro, 2008). The use of RTI represents a distinct shift from child-centered approaches to assessment and intervention, but is also one supported by current
federal legislation (Lichtenstein, 2008).
Training. Growth and refinement in the field of school psychology were
accompanied and made possible by increases and changes in preparation programs. At
the time of the Thayer Conference, 28 institutions offered training in school psychology
(Cutts, 1955). As time progressed, the number of institutions offering school psychology
training programs increased from 28 in 1954 to 211 in 1984 (Fagan, 1986). The National
Association of School Psychologists (NASP) program directory supplements these data,
indicating that in 1989, there were 231 programs, with more current figures from 2008
listing 238 institutions that provided programs in school psychology (Fagan, 2008).
A look at the development of school psychology training programs provides
another lens for examining the evolution of the field. Early school psychology training
programs provided their students with a foundation in clinical psychology, educational
psychology, or a blend of the two (Fagan, 2008). Up until the 1960s, the content offered
in training programs was primarily determined by the background and interests of faculty
solving.
Contemporary roles and functions. Over time, the role and function of the
school psychologist has also changed. Fagan (2008) described four roles that school
psychologists have played at various points in the history of the field. The first, and most
enduring role, is that of the assessor, whose primary responsibility has been the
administration of psychoeducational assessments for the purpose of special education
placement. This role has evolved, such that the school psychologist is now part of a team
of service providers who use a vast array of assessment tools to determine placement and
services for a variety of students in schools. At times, the role of the assessor has also
included identifying and designing interventions for students considered to be at-risk for
academic and social-emotional difficulties in school. The second role is that of the direct
interventionist, where the school psychologist provides individual and group intervention
in the form of academic remediation or counseling. The range of interventions provided
by school psychologists has expanded over the years as a result of improvements in
training and internships, acceptability of more direct interventions such as psychotherapy
in schools, and a focus on the need for the provision of mental health services to students
in schools (Fagan, 2008; Sandoval, 1993; Talley & Short, 1996). The role of the
consultant is the third function, and while this role was practiced by early school
psychologists, only in recent decades has it become a well-researched and theory-based
process (Fagan, 2008). An extension of the consultant role is the more recent function of
the school psychologist as a systems-level interventionist who designs assessments,
interventions, and prevention efforts at the school level, as opposed to the individual level
(NASP, 2010).
counseling activities because they see the need for and value of such services in their
schools.
In addition to the practitioner focus on mental health, professional bodies, such as
NASP, have also responded to concerns and research in this area. Recent NASP position
papers (2006) have recommended that schools provide a range of comprehensive mental
health services where there is found to be a need, in order to promote academic
achievement, school connectedness and community, respectful behavior, student well-being, and a positive school climate. Current training guidelines (NASP, 2000) have
recommended that school psychologists receive training in designing and implementing
prevention and intervention programs to foster the mental health and physical well-being
of the student bodies they serve. Because of their training in mental health and
education, NASP (2006) has advocated for continued role expansion for school
psychologists as effective providers of a wide range of interventions to be delivered in the
school building, such as universal prevention and intervention, specific interventions for
students at risk, and comprehensive interventions as designed or reinforced by
community agencies.
Section Summary. Although the exact direction that the field of school
psychology will take in the future is difficult to predict, a brief look at the history of the
field makes it clear that the role and function of the school psychologist will continue to
evolve. The effect of federal legislation, state credentialing regulations, and training
standards from professional bodies such as APA, NCATE, and NASP will likely continue
to influence training programs and the daily functioning of school psychologists. The
argument can also be made that societal factors, which were largely responsible for the
emergence and establishment of the field of school psychology, will also continue to
influence future directions. Concern for the mental health and well-being of children in
schools is beginning to exert considerable influence on school psychologists, training
programs, and professional bodies.
School Psychologists as Counselors
In addition to focusing on educational needs, one of the roles of the school
psychologist has always been addressing the social, emotional, and behavioral needs of
students in school (Doll & Cummings, 2008). Counseling has always been one of the
roles played by the school psychologist, although the time spent on this activity has
periodically been limited by responsibilities related to assessment; the school psychologist's comfort with, and willingness to offer, counseling; and the presence of other providers of counseling services within a school building (Murphy, 2008). An
understanding of current research and best practices related to evidence-based
interventions, and designing and implementing counseling as a direct intervention is one
way to promote this role in cases where school psychologists report experiencing
discomfort. Given the current focus on mental health needs and student well-being,
knowledge describing effective practice is essential to ensuring that counseling
interventions produce positive social-emotional outcomes for students.
Definitions of counseling and intervention. Many definitions of counseling
have been offered by different authors and professional organizations. For example, the
American Counseling Association (ACA; 2011) defines counseling as "a professional relationship that empowers diverse individuals, families, and groups to accomplish mental health, wellness, education, and career goals" (n. pag.). According to the
evaluation by Tolan and Dodge (2005) that students in America are currently
experiencing a mental health crisis. For instance, Huang et al. (2005) reported that 1 in 5
children have a diagnosable mental disorder. Furthermore, 3-7% of children have been
diagnosed with Attention-Deficit/Hyperactivity Disorder (Root & Resnick, 2003).
According to the American Psychiatric Association (2000), some of the contemporary
issues currently found to have a negative impact on children include family disputes,
child abuse, attention disorders, and violence. One in seven children are reported to have
been punched, kicked, or choked by a parent (Moore, 1994). In addition, Crespi and
Howe (2002) estimate that approximately 80% of children have been exposed to some
form of spousal abuse. One in six families have had to cope with the effects of parental
alcoholism, resulting in 28-34 million people who have directly experienced life in an
alcoholic family (Newcomb, Galaif, & Locke, 2001).
It is reported that more than 8 million children are in need of psychological
services (Carnegie Council on Adolescent Development, 1996), but that most youth with
a psychological disorder never receive mental health care (Farmer, Burns, Philip, Angold,
& Costello, 2003; Ringel & Sturm, 2001). For example, a 2002 study by Kataoka,
Zhang, and Wells described the percentages of children who accessed mental health
services across three cross-sectional, nationally representative samples comprised of
more than 11,500 households. In their sample, it was found that 15-21% of youth ages 6-17 had a mental health problem, while only 6-7.5% of the entire sample (or 29-49% of those same youth) were receiving some form of mental health treatment. One reason for
this may be an inability to pay for such care, as it is estimated that 1 in 7 adolescents lack
health insurance or third-party reimbursement for mental health services in the private
through the education sector, while only 7% pursued these services through specialty
mental health providers, or in a medical facility (4%). These results suggest that schools
may be seen as the primary source for mental health services for children and youth. In
addition, school-based early intervention programs have been found to be effective in
reducing delinquent behavior in adolescents (Crespi & Rigazio-DiGilio, 1996),
suggesting that schools may be an ideal location for the provision of psychological
services (Crespi & Fischetti, 1997). Developmental research highlights the fact that
student mental health and psychological well-being are necessary conditions for
educational success at school (Haertel, Walberg, & Weinstein, 1983; Wang, Haertel, &
Walberg, 1990). As these findings have become better known, schools have
responded by becoming the default provider of mental health services for most children
and adolescents (Hoagwood & Johnson, 2003).
The provision of counseling and mental health services in schools is also
mandated by federal legislation. For example, according to IDEA, counseling and
psychological services are two related services that schools must provide to students, if it
is found that these services are necessary for students with disabilities to benefit from
special education (Wright & Wright, 2009). These authors have explained specific legal
requirements related to the provision of counseling in schools. Counseling services can
only be provided by social workers, psychologists, guidance counselors, or other
qualified professionals. The legal definition of psychological services covers a variety of
responsibilities carried out by the school psychologist. Responsibilities specifically
related to counseling include administering and interpreting assessments, obtaining and
sharing information about a child's behavior and conditions necessary for learning,
Group counseling was the most commonly utilized format (Suldo et al., 2010), with a range of 53-74% of school
psychologists indicating that they offered this service (Hanchon & Fernald, 2011; Yates,
2003). The average number of student groups counseled each year was 8.8 (Curtis et al.,
2008), with group sizes ranging from 2-4 students (Yates, 2003). Respondents reported
seeing 1-5 groups each week, while offering each group between 5-16 sessions (Yates,
2003). Specific issues addressed during group counseling sessions included social skills
development, anger management, study skills, anxiety, grief, and organizational skills
(Suldo et al., 2010).
The second most commonly used counseling format was individual counseling
(Suldo et al., 2010), with 61-88% of school psychologists providing this service
(Hanchon & Fernald, 2011; Yates, 2003). School psychologists met with an average of
9.9 students each year for individual counseling (Curtis et al., 2008). Students received 5
or more sessions each year, with session lengths ranging from 30-45 minutes each (Yates,
2003). Specific behaviors addressed using individual counseling included crisis
intervention, suicide assessment and intervention, threat assessment, de-escalation, and
other various intervention components (Suldo et al., 2010). A range of 32-52% of those
sampled administered classroom counseling sessions (Hanchon & Fernald, 2011; Yates,
2003), making this the third most common counseling format. Issues addressed during
classroom sessions included teaching social skills, family issues, girls' issues, violence
prevention, study skills, art and music therapies, and self-esteem issues (Yates, 2003).
Respondents from two of the surveys also provided family counseling (8-31%; Hanchon
& Fernald, 2011; Yates, 2003). In addition, respondents to the Hanchon and Fernald
survey (2011) also provided crisis response counseling (51%), individual counseling with
school setting (Crespi & Fischetti, 1997; Crespi & Rigazio-DiGilio, 1996).
These factors have prompted school psychologists to offer counseling and mental
health services on a direct and indirect basis (Curtis et al., 2008; Hanchon & Fernald,
2011; Suldo, Friedrich, & Michalowski, 2010; Yates, 2003). Professional organizations
have responded by delineating training standards and guidelines (NASP, 2010), and
federal legislators have passed legal mandates regulating these services
(Wright & Wright, 2009) in an attempt to address student mental health needs. Given the
needs of students, knowledge detailing the current counseling practices of school
psychologists is valuable. Determining whether school psychologists' counseling
practices have adequately addressed student needs, however, may require establishing a
solid foundation of research documenting effective practices in counseling, and ensuring
that those practices and strategies proven to be effective are those being used.
Current Best Practices in Counseling
A variety of different authors and professional organizations provide guidance for
school psychologists as they implement direct and indirect mental health services. As an
example, NASP regularly publishes information detailing best practices which help to
translate research findings into steps that can be taken within the school setting. At this
time, data-based decision making and accountability are two practices that form the basis
for service delivery within the field of school psychology. These practices are also
heavily researched and a popular focus of discussion in relation to educational and mental
health interventions. Doll and Cummings (2008) discussed the importance of data-based
decision making and accountability in relation to the provision of population-based
mental health services, stressing that, based on an assessment of the needs of the school
population, school mental health teams should identify indicators of student emotional
well-being early in the process of planning services. In addition, once these indicators
have been specified, methods for regularly evaluating whether these objectives are being
met must also be determined. These authors recommended continuous and formative
assessment to inform the actions of mental health providers.
In their publication for the NASP School Psychology Forum, Coffee and Ray-Subramanian (2009) described the use of Goal Attainment Scaling (GAS) as one method
for regular progress monitoring of behavioral interventions that can be completed by a
variety of school professionals familiar with the student, or even by the student him- or
herself. According to these authors, additional benefits of using GAS include its utility as
a repeated measure to monitor student behavior on a daily or weekly basis due to its
sensitivity to small changes in behavior, as well as being a tool to evaluate the overall
effectiveness of a given intervention.
Furthermore, Doll and Cummings (2008) supported the exclusive selection and
use of evidence-based treatments as school psychologists provide direct and indirect
mental health services to students. In relation to this, as part of best practices related to
brief individual counseling, Murphy (2008) recommended developing clear and
meaningful goals with the student, while evaluating progress towards goals regularly
throughout the counseling process using feedback from the client, and by comparing
precounseling and postcounseling data gathered using observations, behavior rating
scales, grades, and discipline records.
Evidence-based practices/evidence-based interventions. Over the past 10
years, practitioners in the fields of mental health and education have expressed significant
Table 2
Definitions of Evidence-Based Interventions

Terminology from the Literature / Definition

Evidence-based practice in psychology (APA, 2005):
"the integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences" (n. pag.)

Empirically-supported treatments (Ollendick & King, 2004)

Empirically-supported treatments (Association for Behavioral and Cognitive Therapies [ABCT] & Society of Clinical Child and Adolescent Psychology [SCCAP], 2010b):
"... conditions, like major depression, panic disorder, or obsessive-compulsive disorder, within a given population" (n. pag.)
independent of the treatment condition, but instead, are a consequence of the treatment
that explain why one treatment produces improved outcomes compared to another
(Kraemer, Wilson, Fairburn, & Agras, 2002).
These distinctions and terminology are important contextual factors within the
evidence-based practice movement. Their significance stems from a continued gap
between research and practice, or the difference between what clinical scientists know
about which treatments successfully reduce symptoms, and what clinicians and
practitioners are actually using when working with children and adolescents (Herschell,
McNeil, & McNeil, 2004; Kazdin, Kratochwill, & VandenBos, 1986; Weisz, Weiss, &
Donenberg, 1992). Researchers have cautioned against relying on the assumption that
treatments or interventions found to be successful using efficacy studies will also be
effective when used in routine practice settings (Hoagwood, Burns, Kiser, Ringeisen, &
Schoenwald, 2001). Although a variety of factors maintain the gap between research and
practice, several recommendations have been offered for narrowing it: increased
dissemination of effectiveness research, consideration of the generalizability or
transportability of efficacious interventions to non-research settings, and practice
guidelines related to the mediation and moderation of treatment outcomes, all intended to
ensure that children and adolescents receive the highest quality treatment available
(Hoagwood, Burns, Kiser, Ringeisen, & Schoenwald, 2001; Silverman & Hinshaw, 2008).
Factors supporting the exclusive use of evidence-based practices. The
movement supporting the use of EBIs has been seen in several different fields, such as
medicine, education, social work, nursing, and dentistry (Kazdin, 2008). Nemade, Reiss,
and Dombeck (2007) described the role that insurance companies have played in
promoting the use of EBIs in mental health care because these treatments provide a
measure of accountability as they have scientific evidence supporting their use.
Furthermore, when compared to other treatments such as psychotherapy, EBIs tend to be
shorter term, while providing a scientifically based method for clinicians to justify the
number of sessions needed to address a specific behavior or disorder. Within the
field of education, the federal No Child Left Behind Act (2002) specified that all school
practices must be based on scientifically-based research, the definition of which was
clarified by the U.S. Department of Education (2003) in the following way:
"scientifically based research means there is reliable evidence that the program or
practice works" (n. pag.). In addition, professional organizations representing the fields
of school psychology and social work have affirmed their commitment to the use of EBIs
through various ethical standards and practice models described in Table 3.
Current evidence-based practices related to counseling. Research findings by
Kazdin (2003) indicated that counseling has been used as a feature of interventions for
many issues addressed in schools today, including externalizing and internalizing
problems, learning and mental disabilities, and with profound forms of psychopathology,
such as autism. In line with the movement promoting the use of EBIs, and in response to
research findings indicating that one of the factors preventing some school psychologists
from delivering counseling and mental health services is a lack of adequate training and
experience with these tasks (Suldo, Friedrich, & Michalowski, 2010), this section will
discuss best practices related to designing, implementing, and evaluating evidence-based
interventions.
Table 3
Clauses Recommending the Use of Evidence-Based Interventions

Professional Organization / Recommendation

National Association of School Psychologists (NASP), Model for Comprehensive and Integrated School Psychological Services (2010):
"NASP's mission is accomplished through identification of appropriate evidence-based education and mental health services for all children; implementation of professional practices that are empirically supported, data driven, and culturally competent;" (p. 1)
"School psychologists collect and use assessment data to understand students' problems and to select and implement evidence-based instructional and mental health services" (p. 4)

National Association of Social Workers (NASW), Code of Ethics, 4.01 Competence:
"(c) Social workers should base practice on recognized knowledge, including empirically based knowledge, relevant to social work and social work ethics."
Table 4
An Outline for Planning Interventions Aligned with the Problem-Solving Model

Problem-solving logic / Intervention components

What is the problem?
Behavioral definition
Baseline data
Problem validation

Why is it happening?
Problem analysis

What should be done about it?
Goal setting
Intervention plan development
Measurement strategy
Decision-making plan

Did it work?
Progress monitoring
Formative evaluation
Treatment integrity
Summative evaluation
grounding interventions in a behavioral definition of the target behavior and the success
of the intervention designed to change that behavior (Baer, Wolf, & Risley, 1987; Deno,
1995; Flugum & Reschly, 1994; Reynolds, Gutkin, Elliot, & Witt, 1984; Steege &
Wacker, 1995). Behavioral definitions allow for a common understanding among those
involved in the intervention of when the target behavior does and does not occur, and as
such, are necessary for reliable measurement of the target behavior during
implementation (Kazdin, 1982; Steege & Wacker, 1995). Behavioral definitions must
satisfy the following three conditions, in that they must be: objective, or descriptive of
observable actions that can be seen or heard; clear and unambiguous enough so that
someone unfamiliar with the student or the intervention could repeat or accurately
summarize the behavioral definition; and, complete in describing examples and
nonexamples of the behavior so that anyone observing the child's behavior is able to tell
when the target behavior is and is not occurring (Hawkins & Dobes, 1977; Howell &
Nolett, 2000; Kazdin, 1982; Reschly, Tilly, & Grimes, 2000).
In addition to the intervention components listed in Table 4, Forman and Burke
(2008) provided additional recommendations designed to help improve the effectiveness
of counseling interventions. Once goals have been formulated, during the intervention
plan development phase, these authors propose that school psychologists conduct a
review of the literature pertaining to EBIs proven to be effective in remediating the
identified problem, and select an intervention from these sources. Forman and Burke
(2008) also suggested that school psychologists identify intervention implementers and
stakeholders, assess their perceptions, attitudes, and beliefs related to the intervention,
and develop administrative and stakeholder support for the intervention. In addition, they
other researchers, it may be helpful to look at the research literature for more specific
guidelines related to designing direct behavioral interventions and EBIs. With these
more specific guidelines in mind, researchers are better able to evaluate whether the gap
between research and practice (Herschell, McNeil, & McNeil, 2004; Kazdin, Kratochwill, &
VandenBos, 1986; Weisz, Weiss, & Donenberg, 1992) extends to school psychologists in
terms of their application of current research related to accountability and data-based
decision making in the provision of counseling services.
Designing and Evaluating Direct Interventions
This section will discuss literature on designing and implementing direct
interventions. The current status of research related to academic and behavioral
interventions is a necessary starting point for school psychologists looking to align their
work with students with best practices. Although, at this time, research describing
effective practices in terms of social-emotional-behavioral interventions is in its early
stages, available research guidelines reinforce the importance of repeated measures of
student behavior to inform interventions, as well as the use of evidence-based techniques
and strategies. As such, this section concludes with a review of evidence-based
interventions for behavioral issues commonly seen in children.
Comparing academic and social-emotional-behavioral interventions. Under
traditional assessment models, teachers would identify students displaying behavioral
problems and school psychologists would conduct evaluations for special education
eligibility (Kamphaus, DiStefano, Dowdy, Eklund, & Dunn, 2010) with the result that
only students demonstrating high levels of need would be provided with services (Cash &
Nealis, 2004). Large numbers of students with behavioral and emotional problems (Mills
et al., 2006), however, called into question the utility and effectiveness of the teacher
referral system for several reasons: teachers may not be adequately trained to recognize
developing problem behaviors; teachers vary in their ability to address problem
behaviors, leading to different rates of referral; some students are not identified in an
effective and efficient manner (Tilly, 2008); and, some teachers consider behavior
problems and difficulties with social-emotional adjustment to be beyond their area of
responsibility (Severson, Walker, Hope-Doolittle, Kratochwill, & Gresham, 2007).
A problem-solving approach to identifying and supporting students with
behavioral problems has been recommended as an alternative to the teacher referral
system (Tilly, 2008). Although there are a variety of problem-solving approaches in use
in different contexts, some common features have been noted across different models,
such as the importance of universal screening and periodic assessment (Schwanz &
Barbour, 2005). Despite the existence of multiple problem-solving approaches,
traditional school-based assessment practices have not always meshed well with the
problem-solving approach to assessment and data-based decision making (Gresham et al.,
2010). An example of this is the use of standardized ability and achievement tests and
behavioral measures that have proved useful for making eligibility decisions, but do not
possess the treatment validity needed to inform instruction, have not been found to be
feasible or designed to progress-monitor a students response to intervention (Fuchs &
Fuchs, 1998; Gresham, 2002; Gresham & Witt, 1997), and have not been designed to
measure response to intervention in order to make special education eligibility or exit
decisions (Gresham, 2007; Shinn, 2008).
Furthermore, traditional school-based assessment practices do not always fit in
with the RTI paradigm (Briesch, Chafouleas, & Riley-Tilman, 2010). According to the
RTI framework, problem behaviors must be systematically and proactively defined
through the screening process and then regularly measured as part of progress monitoring
to determine whether the intervention was successful or other remediation strategies are
necessary. Some potential problems arise, however, when an RTI paradigm is applied to
behavioral assessment, as many available psychometrically sound assessment measures
are not feasible for repeated administration with large groups of students because of their
length and the frame of reference raters must consider when providing responses
(Briesch, Chafouleas, & Riley-Tilman, 2010).
The current link between social-emotional-behavior (SEB) assessment and
intervention is tenuous (Merrell, 2010). Following the problem-solving model, the
strength of current SEB assessments lies in the practitioner's ability to use these measures
to determine problem behaviors and the factors maintaining them, without providing
guidance on how to address these problems and determine whether what has been done
was effective (Merrell, 2010). Many rating scales for common DSM-IV
disorders are available (Merrell, 2008; Pelham, Fabiano, & Massetti, 2005), but they are not
sensitive enough to change to be useful as repeated measures of the target behavior of an
intervention (Volpe & Gadow, 2010). This is especially true in cases when the target
behavior is not the sole focus of the rating scale, or when a given scale is not best suited
to the referral concern (Volpe & Gadow, 2010). In addition, limited time and resources
may prevent practitioners from collecting enough data to make informed decisions
related to student behavior, underscoring the need for continued research and
development of feasible and psychometrically sound behavioral assessment measures
Table 5

Problem-solving logic / Intervention component

What is the problem?
Behavioral definition
Baseline data
Problem validation

Why is it happening?
Problem analysis steps

What should be done about it?
Goal setting
Intervention plan
Measurement strategy
Decision-making plan

Did it work?
Progress monitoring
Formative evaluation
Treatment integrity
Summative evaluation
are also unable to design formative assessment measures or determine decision rules
specifying what an appropriate response to the intervention would look like (Chafouleas,
Volpe, Gresham, & Cook, 2010).
A variety of behavioral assessment methods exist, each
with its own set of strengths and weaknesses with respect to psychometric defensibility,
flexibility, feasibility, and repeatability (Chafouleas, Volpe, Gresham, & Cook, 2010).
As such, there is no single best assessment method, and while a combination of different
methods may be the best approach, at this point in time, researchers have not yet defined
a clear set of guidelines based on empirical evidence (Chafouleas, Volpe, Gresham, &
Cook, 2010). Some researchers have proposed academic engagement as a target behavior
serving as the foundation of general outcome measures (Briesch, Chafouleas, & Riley-Tilman, 2010). Consensus has not been reached, however, on the definition of a general
outcome measure for behavior in school-based assessment, or whether it is possible to
establish such a construct in behavioral assessment, and as such, discussion of the most
appropriate behavioral targets to measure using the problem-solving model is on-going
(Chafouleas, Volpe, Gresham, & Cook, 2010).
Another deviation from academic interventions involves the difficulty inherent in
measuring the growth or development of a specific skill related to a target behavior
(Chafouleas, Volpe, Gresham, & Cook, 2010). Unlike academic domains, benchmarks
for desirable or appropriate behaviors have not been established or agreed upon
universally. For some students, no change in behavior is desirable, while for others,
desirable behavior is explicitly tied to the target behavior and the context in which it
occurs (Chafouleas, Volpe, Gresham, & Cook, 2010). Furthermore, visual analysis of
CBT
Individual CBT with parents (Cornwall, Spence, & Schotte, 1996)
Individual CBT with cognitive parent training (Nauta, Schooling, Emmelkamp, &
Minderaa, 2003)
Group CBT with parental anxiety management for anxious parents (Cobham, Dadds, &
Spence, 1998)
Family CBT (Bogels & Siqueland, 2006; Wood, Piacentini, South-Gerow, Chu, &
Sigman, 2006)
Parent group CBT (without youth involvement) (Mendlowitz, Manassis, Bradley,
Scapillato, Miezitis, & Shaw, 1999; Thienemann, Moore, & Tompkins, 2006)
Group CBT with parents plus internet (Spence, Holmes, March, & Lipp, 2006)
Behavior/Intervention
Table 6
(table continues)
Possibly Efficacious
(Silverman, Pina, &
Viswesvaran, 2008)
Probably Efficacious
(Silverman, Pina, &
Viswesvaran, 2008)
Classification
68
Behavior/Intervention
Classification
68
Probably Efficacious
(Silverman, Pina, &
Viswesvaran, 2008)
(table continues)
Well-Established
(Silverman, Pina, &
Viswesvaran, 2008)
Possibly Efficacious
(Barrett, Farrell, Pina,
Peris, & Piacentini, 2008)
Probably Efficacious
(Barrett, Farrell, Pina,
Peris, & Piacentini, 2008)
School Refusal
CBT
Possibly Efficacious
Individual CBT for school refusal (Heyne, King, Tonge, Rollings, Young, Pritchard et al., (Silverman, Pina, &
Viswesvaran, 2008)
2002; Last, Hansen, & Franco, 1998)
Individual CBT for school refusal with parent/teacher training (Heyne, King, Tonge,
Rollings, Young, Pritchard et al., 2002; King, Tonge, Heyne, Pritchard, Rollings, Young,
et al., 1998)
Parent/teacher training for school Refusal (Heyne, King, Tonge, Rollings, Young,
Pritchard et al., 2002)
Table 6 continued
69
Probably Efficacious
(Silverman, Pina &
Viswesvaran, 2008)
Classification
Possibly Efficacious
(Silverman, Pina, &
Viswesvaran, 2008)
69
Specific Phobia
CBT
Possibly Efficacious
(Silverman, Pina, &
Emotive imagery for SP of darkness (Cornwall, Spence, & Schotte, 1996)
Viswesvaran, 2008)
In-vivo behavioral exposures with EMDR for SP of spiders (Muris, Merckelbach,
Holdrinet, & Sijsenaar, 1998)
Exposures plus contingency management for SP (Silverman, Kurtines, Ginsburg, Weems,
Rabian, & Serafini, 1999)
Exposures plus self-control for SP (Silverman, Kurtines, Ginsburg, Weems, Rabian, &
Serafini, 1999)
(table continues)
Social Phobia
CBT
Group CBT for SOP (Social Phobia; Gallagher, Rabian, & McCloskey, 2003; Hayward,
Varady, Albano, Thienemann, Henderson, & Schatzberg, 2000; Spence, Donovan, &
Brechman-Toussaint, 2000)
Social Effectiveness Training for Children for SOP (Beidel, & Morris, 2000)
Behavior/Intervention
CBT
Resilient Peer Treatment (Fantuzzo et al., 1996; Fantuzzo, Manz, Atkins, & Meyers,
2005)
Group CGT (Deblinger, Stauffer, & Steer, 2001)
Cognitive Processing Therapy (Ahrens & Rexford, 2002)
Eye Movement Desensitization and Reprocessing (Chemtob, Nakashima, & Carlson,
2002; Jaberghaderi, Greenwald, Rubin, Zand, & Dolatabadi, 2004)
Client Centered Therapy (Cohen, Deblinger, Mannarino, & Steer, 2004)
Family Therapy (Kolko, 1996)
Child Parent Psychotherapy (Lieberman, Van Horn, & Ippen, 2005)
Table 6 continued
70
70
CBT
Penn Prevention Program (PPP) - including culturally relevant modifications as seen in
the Penn Optimism Program (POP; Gillham, Reivich, Jaycox, & Seligman, 1995; Jaycox,
Reivich, Gillham, & Seligman, 1994; Roberts, Kane, Thomson, Bishop, & Hart, 2003;
Yu & Seligman, 2002)
Self-Control Therapy (Stark, Reynolds, & Kaslow, 1987; Stark, Rouse, & Livingston,
1991)
Behavior Therapy (Kahn, Kehle, Jenson, & Clark, 1990; King & Kirschenbaum, 1990)
Depression Children
CBT
Individual CBT (Asarnow, Scott, & Mintz, 2002; Gillham, Reivich, Jaycox, & Seligman,
1995; Jaycox, Reivich, Gillham, & Seligman, 1994; Kahn, Kehle, Jenson, & Clark, 1990;
Nelson, Barnard, & Cain, 2003; Roberts, Kane, Thomson, Bishop, & Hart, 2003; Stark,
Reynolds, & Kaslow, 1987; Stark, Rouse, & Livingston, 1991; Weisz, Thurber, Sweeney,
Proffitt, & LeGagnoux, 1997; Yu & Seligman, 2002)
CBT group, child only (Gillham, Reivich, Jaycox, & Seligman, 1995; Jaycox, Reivich,
Gillham, & Seligman, 1994; Kahn, Kehle, Jenson, & Clark, 1990; Roberts, Kane,
Thomson, Bishop, & Hart, 2003; Stark, Reynolds, & Kaslow, 1987; Weisz, Thurber,
Sweeney, Proffitt, & LeGagnoux, 1997; Yu & Seligman, 2002)
CBT child group, plus parent component (Asarnow, Scott, & Mintz, 2002; Stark, Rouse,
& Livingston, 1991)
Behavior/Intervention
One-session exposure treatment for SP (Ost, Svensson, Hellstrom, & Lindwall, 2001)
One-session exposure treatment with parents for SP (Ost, Svensson, Hellstrom, &
Lindwall, 2001)
Table 6 continued
(table continues)
Probably Efficacious
(David-Ferndon, &
Kaslow, 2008)
Classification
71
71
CBT
Adolescent group CBT, plus parent component (Clarke, Hawkins, Murphy, Sheeber,
Lewinsohn, & Seeley, 1995; Clarke, Rohde, Lewinsohn, Hops, & Seeley, 1999;
Kowelenko et al., 2005; Lewinsohn, Clarke, Hops, & Andrews, 1990; Lewinsohn,
Clarke, Rohde, Hops, & Seeley, 1996)
Individual CBT (Rossello & Bernal, 1999; Wood, Harrington, & Moore, 1996)
Individual CBT, plus parent/family component (Brent et al., 1997; Melvin, Tonge, King,
Heyne, Gordon, & Klimkeit, 2006; Treatment for Adolescents with Depression Study
(TADS) Team, 2004)
Adolescents Coping with Depression (CWD-A; Clarke, Hawkins, Murphy, Sheeber,
Lewinsohn, & Seeley, 1995; Clarke et al., 2001; Clarke, Rohde, Lewinsohn, Hops, &
Seeley, 1999; Lewinsohn, Clarke, Hops, & Andrews, 1990; Lewinsohn, Clarke, Rohde,
Hops, & Seeley, 1996; Rohde, Clarke, Mace, Jorgensen, & Seeley, 2004)
Interpersonal Psychotherapy (IPT)
IPT for Depressed Adolescents (IPT-A; Mufson, Dorta, Wickramaratne, Nomura, Olfson,
& Wiessman, 2004; Mufson, Weissman, Moreau, & Garfinkel, 1999
(table continues)
Probably Efficacious
(David-Ferndon, &
Kaslow, 2008)
Behavior/Intervention
Classification
Depression Adolescents
CBT
Well-Established (DavidGroup Cognitive Behavior Therapy, adolescent only (Clarke, Hawkins, Murphy, Sheeber, Ferndon, & Kaslow, 2008)
Lewinsohn, & Seeley, 1995; Clarke, Rohde, Lewinsohn, Hops, & Seeley, 1999;
Kowlenko et al., 2005; Lewinsohn, Clarke, Hops, & Andrews, 1990; Lewinsohn, Clarke,
Rohde, Hops, & Seeley, 1996; Reynolds & Coats, 1986)
Table 6 continued
72
72
Behavior Therapy
Helping the Noncompliant Child (Peed, Roberts, & Forehand, 1977; Wells & Egan,
1988)
Triple P (Positive Parenting Program) - Standard (Bor, Sanders, & Markie-Dadds, 2002;
Sanders, Markie-Dadds, Tully, & Bor, 2000); Enhanced (Bor, Sanders, & Markie-Dadds,
2002; Sanders, Markie-Dadds, Tully, & Bor, 2000)
Incredible Years - Parent training (Webster-Stratton & Hammond, 1997; WebsterStratton, Reid & Hammond, 2004); Child training (Webster-Stratton & Hammond, 1997;
Webster- Stratton, Reid & Hammond, 2001; Webster-Stratton, Reid, & Hammond, 2004)
CBT
Anger Control Training (Lochman, Coie, Underwood, & Terry, 1993; Robinson, Smith,
& Miller, 2002)
Rational-emotive mental health program (Block, 1978)
Behavior/Intervention
Child and Adolescent ADHD
Behavioral parent training (BPT; Barkley et al., 2000; Bor, Sanders, & Markie-Dadds,
2002; Hoath & Sanders, 2002; MTA Cooperative Group, 1999; Sonuga-Barke, Daley,
Thompson, Laver-Bradbury, & Weeks, 2001)
Behavioral classroom management (BCM; Barkley et al., 2000; Miranda, Prescentacion,
& Soriano, 2002; MTA Cooperative Group, 1999)
Behavioral peer interventions (BPI; Pelham et al., 2000)
Table 6 continued
(table continues)
Probably Efficacious
(Eyberg, Nelson, & Boggs,
2008)
Well-Established (Eyberg,
Nelson, & Boggs, 2008)
Well-Established (Pelham
& Fabiano, 2008)
Classification
73
73
CBT
Group Anger Control Training (Feindler, Marriot, & Iwata, 1984)
Reaching Educators, Children, and Parents (RECAP; Weiss, Harris, Catron, & Han,
2003)
Behavior Therapy
Triple P (Positive Parenting Program) - standard group treatment (Leung, Sanders,
Leung, Mak, & Lau, 2003)
First Step to Success Program (Walker, Kavanagh, Stiller, Golly, Severson, & Feil, 1998)
Self-administered Treatment, plus Signal Seat (Hamilton & MacQuiddy, 1984)
Incredible Years Parent Training and Child Training (Webster-Stratton & Hammond,
1997); Parent Training and Teacher Training (Webster-Stratton, Reid, & Hammond,
2004); Incredible Years Parent Training, Teacher Training, and Child Training
Behavior/Intervention
Parent-Child Interaction Therapy (Nixon, Sweeney, Erickson, & Touyz, 2003;
Schuhmann, Foote, Eyberg, Boggs, & Algina, 1998)
Problem-Solving Skills Training Standard (Kazdin, Bass, Siegel, & Thomas, 1989;
Kazdin, Esveldt-Dawson, French, & Unis, 1987b; Kazdin, Siegel, & Bass, 1992);
Problem-Solving Skills Training and Practice (Kazdin, Bass, Siegel, & Thomas, 1989);
Problem-Solving Skills Training and Parent Management Training (Kazdin, EsveldtDawson, French, & Unis, 1987a)
Group Assertiveness Training - Counselor-led (Huey & Rank, 1984); Peer-led (Huey &
Rank, 1984)
Multidimensional Treatment foster care (Chamberlain & Reid, 1998; Leve, Chamberlain,
& Reid, 2005)
Multisystemic Therapy
Multisystemic Therapy (Borduin et al., 1995; Henggeler, Melton, Brondino, Scherer, &
Hanley, 1997; Henggeler, Melton, & Smith, 1992; Henggeler, Pickrel, & Brondino, 1999)
Table 6 continued
(table continues)
Possibly Efficacious
(Eyberg, Nelson, & Boggs,
2008)
Classification
74
74
Behavior/Intervention
(Webster-Stratton, Reid & Hammond, 2004); Teacher Training and Child Training
(Webster-Stratton, Reid & Hammond, 2004)
Table 6 continued
Probably Efficacious
(Waldron & Turner, 2008)
Well-Established (Waldron
& Turner, 2008)
Classification
75
75
Autism
Behavior Therapy
Lovaas' Method (Cohen, Amerine-Dickens, & Smith, 2006; Eikeseth, Smith, Jahr, &
Eldevik, 2002; Lovaas, 1987; Smith, Lovaas, & Lovaas, 2002)
Parent Training (Aldred, Green, & Adams, 2004; Drew et al., 2002; Jocelyn, Casiro,
Beattie, Bow & Kneisz, 1998)
CBT
Child and Family-Focused CBT (West et al., 2009)
Dialectical Behavior Therapy (Goldstein, Axelson, Birmaher & Brent, 2007)
Psychotherapy
Individual Family Psychoeducation (Young & Fristad, 2007)
Behavior/Intervention
Adolescent Bulimia Nervosa
CBT
Guided self-care for binge-eating in BN (Schmidt et al., 2007)
Family Therapy for BN (Le Grange, Crosby, Rathouz, & Leventhal, 2007)
Table 6 continued
Possibly Efficacious (Rogers & Vismara, 2008)
Well-Established (Rogers & Vismara, 2008)
Probably Efficacious (Association for Behavioral and Cognitive Therapies [ABCT] & Society of Clinical Child and Adolescent Psychology [SCCAP], 2010b)
Possibly Efficacious (Association for Behavioral and Cognitive Therapies [ABCT] & Society of Clinical Child and Adolescent Psychology [SCCAP], 2010b)
Classification
Table 7
Criteria for Classifying Evidence-Based Psychosocial Treatments
Well-Established Treatments
There must be at least two good group-design experiments, conducted in at least two
independent research settings and by independent investigatory teams, demonstrating
efficacy by showing the treatment to be:
a) statistically significantly superior to pill or psychological placebo or to another
treatment
OR
b) equivalent (or not significantly different) to an already established treatment in
experiments with statistical power being sufficient to detect moderate differences
AND
Treatment manuals or logical equivalent were used for the treatment
Conducted with a population, treated for specified problems, for whom inclusion
criteria have been delineated in a reliable, valid manner
Reliable and valid outcome assessment measures, at minimum tapping the problems
targeted for change, were used, and
Appropriate data analyses
Probably Efficacious Treatments
There must be at least two good experiments showing the treatment is superior
(statistically significantly so) to a wait-list control group
OR
One or more good experiments meeting the Well-Established Treatment criteria, with
the one exception that they were not conducted in at least two independent research
settings by independent investigatory teams
Possibly Efficacious Treatments
At least one good study showing the treatment to be efficacious in the absence of
conflicting evidence
Experimental Treatments
Treatment not yet tested in trials meeting task force criteria for methodology
Adapted from the Division 12 Task Force on Psychological Interventions reports
(Chambless et al., 1996, 1998), from Chambless and Hollon (1998), and from Chambless
and Ollendick (2001).
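The decision rules in Table 7 amount to a small classification procedure. As a rough illustration only, that logic might be sketched as follows; the function and argument names (e.g., `n_good_studies`, `independent_teams`) are hypothetical simplifications of the task force criteria, not terminology from the source reports.

```python
def classify_treatment(n_good_studies, independent_teams, superior_or_equivalent,
                       manualized, beats_waitlist, conflicting_evidence=False):
    """Illustrative sketch of the Table 7 decision rules (argument names
    are this sketch's simplifications, not the task force's terms)."""
    # Core Well-Established requirements beyond study count: superiority
    # (or equivalence to an established treatment) and use of a manual.
    well_established_core = superior_or_equivalent and manualized

    # Well-Established: >= 2 good experiments by independent teams in
    # independent settings, meeting the core requirements.
    if n_good_studies >= 2 and independent_teams and well_established_core:
        return "Well-Established"

    # Probably Efficacious: >= 2 good experiments beating a wait-list
    # control, OR >= 1 experiment meeting Well-Established criteria
    # except for the independence requirement.
    if (n_good_studies >= 2 and beats_waitlist) or \
       (n_good_studies >= 1 and well_established_core):
        return "Probably Efficacious"

    # Possibly Efficacious: at least one good study, absent conflicting
    # evidence.
    if n_good_studies >= 1 and not conflicting_evidence:
        return "Possibly Efficacious"

    # Experimental: not yet tested in trials meeting the criteria.
    return "Experimental"
```

The ordering of the branches mirrors the hierarchy of the criteria: each category is checked only after the stricter one above it has failed.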
In addition to CBT and BT, other therapies that have been determined to be
well-established include Family Therapy, Multidimensional Family Therapy, Functional
Family Therapy, Interpersonal Therapy, Behavioral Parent Training, Behavioral
Classroom Management, and Behavioral Peer Interventions.
One reason for the high level of empirical support for CBT and BT techniques
could be related to the overlap between best practices in counseling and central tenets of
these treatments. For example, cognitive-behavioral practices are grounded in
empirically validated psychological theories, such as those related to learning and
cognition. The selection of specific treatment strategies is based on the
characteristics of the child and on EBIs that have been shown to effectively address
the behaviors being displayed. Empiricism is the foundation of cognitive-behavioral
practice: problematic behaviors are described and treated using objective terms and
definitions, measurable goals, and quantitative analysis of behavior. A major
component of treatments using CBT involves gathering objective data before, during, and
after an intervention. On-going evaluation of treatment goals is essential, as decisions
governing further intervention are made by examining data demonstrating the efficacy of
strategies already in use.
Section summary. School psychologists have acknowledged the current mental health
needs of students and the connection between mental health and academic success
(Haertel et al., 1983; Wang et al., 1990). Data
collected describing the current counseling practices of school psychologists detail their
efforts to address the needs of the students with whom they work. Evidence of the
commitment to meeting the behavioral needs of students can be seen through the research
focus on the development of best practices related to indirect and direct intervention,
particularly in the area of counseling, the commitment to the use of evidence-based
interventions and research in this area, and the application of the problem-solving model
to the design and implementation of behavioral interventions. If research on best
practices, evidence-based interventions, and the problem-solving model were applied
to the design and implementation of counseling interventions, school psychologists
would employ strategies proven effective in addressing behavioral problems while
continuously monitoring whether their efforts are impacting student behavior as
intended. Although research in these areas is ongoing, the use of evidence-based
interventions, together with the regular progress monitoring of student behavior and
the data-based decision making inherent in the problem-solving model, allows school
psychologists to hold themselves accountable for meeting the mental health needs of
students when implementing counseling interventions with students who display
problem behavior.
Legislation Impacting the Field of School Psychology
In a similar fashion to the way the field of school psychology has evolved, federal
education policy has developed and changed in response to the needs of students.
Pertinent legislation influencing the practice of school psychology has been shaped by
federal involvement regarding students with disabilities, and students who come from
socially disenfranchised or economically disadvantaged backgrounds. The federal
government has responded to reports indicating that these groups of students were not
being provided with appropriate educational opportunities by passing laws and
regulations to address these inequalities and ensure that certain standards of practice are
followed. Over time these laws have been amended, and the role of the school
psychologist has evolved in response to these changes. This section provides an
overview of how federal policy and agencies, cases, and expository reports have
influenced education in general and the practice of school psychology more specifically.
A summary of selected influential federal policies and agencies, cases, and expository
reports can be found in Tables 8, 9, 10, and 11.
This selected review of federal legislation, court decisions, expository reports, and
agencies provides an important context for examining the current climate of education in
American schools. From the period of the 1960s to the 1980s, a series of expository
reports (e.g., A Nation at Risk, Time for Results) publicized weaknesses in the education
of students in American schools, while at the same time proposing standardized testing
and increased autonomy for schools, contingent on improved outcomes. Rulings in
several court cases have specified standards for assessing special education eligibility and
determining how best to educate students with disabilities (e.g., PARC v. Pennsylvania,
Mills v. Board of Education). In response to these decisions, the federal government has
passed a series of laws codifying these decisions, and as such, shaped the role of the
school psychologist (e.g., IDEA). Several agencies have been created in order to
disseminate information on effective instruction, monitor the allocation and use of federal
funding, and collect and organize information detailing student achievement (e.g., NAEP,
OERI, National Assessment Governing Board). Federal legislation has evolved from
providing funding for programs to benefit students from minority or economically
disadvantaged backgrounds, to making such funding contingent on improvements in
student outcomes. Increased government regulation and accountability for student
(table continues)
Stewner-Manzanares (1988):
Targeted aid to schools with large numbers of non-English-speaking students
Increased federal aid for compensatory programs for students in poverty by 23%
Crespino (2006):
Applied desegregation requirements to all schools in the U.S.
Prevented parents from choosing where to send their children if doing so meant
that schools would be racially imbalanced
Description
provided schools financial aid to be used to benefit economically disadvantaged children
Table 8
Table 8 continued
Title and Year
Description
Education for All Handicapped Children Act (1975)
Beyer (1989):
Mandated that all children with disabilities between the ages of 5-21 be provided
with a free appropriate education through the use of special education and related
services tailored to meet their individual needs
Specified due process provisions to protect the rights of parents and guardians
and include them in educational decision-making
Provided increased financial assistance to States and localities to fund
categorical programs for children with disabilities
(table continues)
Description
House Committee on Education and Labor (1990):
Increased the amount of federal aid to schools documenting increases in student
achievement using test scores or other achievement measures
Mandated increased regulation from local districts and state departments of
education for schools unable to document gains
Increased funding for school-wide reforms
Merrell, Ervin, & Gimpel (2006):
Reauthorized the original provisions of EHA
Required transition services for students with disabilities
Added autism and traumatic brain injury to the list of federal disability
conditions for which special education services are provided
School of Public Health and Health Professions, University at Buffalo (2005):
Defined Assistive Technology Devices and Services to be included in student IEPs
Extended the Least Restrictive Environment clause requiring that, to the
maximum extent, students with disabilities be educated with their non-disabled
peers
Table 8 continued
Title and Year
Hawkins-Stafford School Improvement Amendments (1988)
Table 8 continued
Title and Year
Description
Required that economically disadvantaged students be held to the same
standards as their peers
(table continues)
Description
Nelson & Weinbaum (2009):
Ruled that federal courts would uphold states' school funding systems provided
that state education systems developed the basic skills necessary for students to
participate in a democratic society
Made states responsible for covering the cost of educational services mandated
by federal laws or federal court decisions
Specified that school funding decisions would no longer be handled by federal
courts, as educational quality and resources could no longer be considered
federal rights
Table 9
Table 9 continued
Case and Year
Larry P. v. Riles (1979, 1986)
Description
Jacob & Hartshorne (2007):
Established the legal precedent that standardized tests administered to children
from diverse cultural backgrounds must have been validated for this purpose
Description
Washington Research Project & NAACP Legal Defense and Educational Fund
(1969):
Reported a lack of evidence connecting Title I funding and increases in
academic achievement among students in poverty
Highlighted the misuse of Title I funds in several states where it had been found
that money had been disproportionately allocated to suburban districts
Report
Title I of the ESEA: Is it helping poor children?
Table 10
Description
U.S. National Archives and Records Administration (2011):
Created in 1972
Serves as an accountability mechanism for federally funded education programs
through study of the connection between federal dollars and academic
achievement
Agency
National Institute of Education
Table 11
improvement are two factors that continue to impact the field of education in present
times.
Contemporary legislation. Shortly after taking office in 2001, one of George W.
Bush's first legislative actions was to propose the No Child Left Behind Act (NCLB;
2002), which would serve as a reauthorization of the ESEA and IASA. The NCLB
represented a continuation of several provisions from its previous versions, including
Title I, the 21st Century Schools Act, bilingual education, Title II grants funding
innovation, and a sizeable reading program, among others. Provided in Table 12 is a
summary of the key provisions of NCLB.
The provisions of NCLB signified the intent and commitment of the federal
government to measure and monitor the educational achievement of all students in
schools. At the same time, teachers and administrators were being held increasingly
responsible for demonstrating student progress and proficiency across a variety of subject
areas. In addition, parents were empowered with information and alternatives for
remediation to ensure that their children were being provided with quality instruction.
After a lengthy series of negotiations, the NCLB Act was passed in 2001. From
the beginning, various education interest groups, such as the National School Boards
Association, the American Association of School Administrators, the National Education
Association, and the National Conference of State Legislatures, voiced concern over
the bill, arguing that school districts would not be able to meet its demands given
the limited federal funding it provided. In response to this
concern, the Bush administration maintained that, without federally mandated
accountability and assessment practices outlined in NCLB, states would continue the
Table 12
Key Provisions of NCLB
Accountability for Student Outcomes:
provided federal financing to bolster achievement using standards, assessments, and
accountability regulations
mandated that standards and assessments were to be applied to all students
specified accountability measures in the form of corrective actions for schools in need
of improvement
detailed a formula to determine how and when to take corrective action for schools
that failed to meet progress targets, such that:
a) by the year 2014, all students were expected to be performing at a proficient
level in reading, mathematics, and science
b) each school year, schools had to make gains in students' adequate yearly
progress (AYP) such that 100% proficiency would be reached by 2014
c) the annual rate of progress would be calculated for aggregated as well as
disaggregated student groups based on income, race, gender, English language
proficiency, and special education classification, with the entire school considered
in need of improvement if any one of these groups were not meeting goals for
expected progress
Adequate Yearly Progress
A school receiving Title I funding that had not met AYP for two consecutive years
was to be referred to as a school in need of improvement:
the school was given the responsibility of writing a plan for improving students'
educational progress
the local education agency provided the school with technical resources for plan
implementation
students were given the choice of transferring to another school within the district
that was not in need of improvement
If during the following year, the school was still not able to make AYP, it retained the
status of a school in need of improvement:
Students retained the option to transfer
Students from low socioeconomic backgrounds could receive supplemental
educational services, such as tutoring or remedial classes, from either a public or
private state-approved agency
When a school did not make AYP for four consecutive years:
The district enforced corrective actions, such as replacing staff or making
curricular changes
Parents were given the opportunity to send their children to a different school
In the event that AYP was not met for a fifth consecutive year, a restructuring plan
was implemented by the school district:
(table continues)
Table 12 continued
The school could reopen as a charter school
Staff members could be replaced
The leadership of the school could be turned over to the state or a private agency
Effective Instruction
stressed the use of effective instructional methods, particularly in reading, by
offering grants to fund research-based instructional programs
required that teachers meet certain training standards, including the completion of
a bachelor's degree, demonstration of competency in specific areas of instruction,
and documentation that they had met their state's requirements for licensure or
certification
required that paraprofessionals meet certain training standards, including the
completion of two years of college, or demonstration of their ability to support
student learning in reading, writing, and math
mandated that schools make public the certification status and educational
attainment of the teachers and paraprofessionals employed in their buildings
required that schools begin conducting yearly testing in reading, math, and
science for students in grades 3-8 to determine whether or not students were
meeting goals for AYP, with the overall goal that all students would be proficient,
or demonstrating grade-level competency, by the 2013-2014 school year
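The AYP provisions in Table 12 reduce to a simple rule: a school misses AYP, and can be flagged as in need of improvement, if any aggregated or disaggregated student group falls short of the annual proficiency target. A minimal sketch of that rule, using a hypothetical mapping of group names to proficiency rates (the group labels and numbers below are illustrative, not NCLB's own data format):

```python
def misses_ayp(group_proficiency, target):
    """Return True if any aggregated or disaggregated group falls below
    the annual proficiency target (illustrative data format)."""
    return any(rate < target for rate in group_proficiency.values())

# Hypothetical school: the all-students rate meets the target, but the
# whole school is still flagged because two subgroups fall short.
rates = {"all_students": 0.72, "low_income": 0.58,
         "english_learners": 0.49, "special_education": 0.61}
misses_ayp(rates, target=0.60)  # True
```

The point the sketch makes explicit is that disaggregation changes the outcome: a school whose overall proficiency exceeds the target is still considered in need of improvement if even one reported subgroup misses it.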
legacy of leaving behind students with disabilities and those from minority or
economically disadvantaged backgrounds. To provide extra funding, NCLB permitted
states to reallocate funding from non-Title I federal programs into their Title I budgets.
Additionally, the State and Local Flexibility Demonstration Act allowed states to redirect
administrative and activity funds from other ESEA programs into supplemental learning
programs that were specifically designed to help students make AYP. Overall, NCLB
increased Title I funding by 20% for schools in urban areas or areas with a high
concentration of students from economically disadvantaged backgrounds.
In order to meet yearly testing requirements, many states needed only to revise the
testing programs they had previously created using Goals 2000 funding to account for the
provision that all students in grades 3-8, rather than certain samples or benchmark grades,
were tested. By 2001, 49 states had written content standards and mandatory tests for
graduation and grade-level promotion. Despite this fact, the quality of proficiency
standards varied from one state to another (Nelson & Weinbaum, 2009). As a result,
variability was also found when determining AYP, resulting in a non-uniform distribution
of schools in need of improvement between and within states.
The most current revision to IDEA occurred in 2004, when the Individuals with
Disabilities Education Improvement Act of 2004 (IDEA 2004; Wright & Wright, 2009)
was signed into law (Merrell, Ervin, & Gimpel, 2006). Jacob and Hartshorne (2007)
provided information related to the passage of this latest revision. To guide their
amendments, Congress cited several notable research findings, which are listed in Table
13. This most current authorization of IDEA placed the onus on schools to provide
students with disabilities with effective early intervention services, evidence-based
Table 13
Research Findings Guiding IDEA 2004
The effective education of children with disabilities is achieved with a foundation
based on high achievement standards [20 U.S.C. 1400 (5)(A)]
Special education is a service and not a place [20 U.S.C. 1400 (5)(C)], and
therefore students with disabilities should be provided with access to the general
education curriculum in the regular education classroom [20 U.S.C. 1400 (5)(D)]
Funding should be provided for school-wide practices, evidence-based instruction in
reading, positive behavioral supports, and early intervention services [20 U.S.C.
1400 (5)(E)]
Data on the increasing diversity of the school-aged population highlight the need for
more effective instruction for students with limited proficiency in English [20 U.S.C.
1400 (10)(A)]
information they need to evaluate school choice and contribute to school effectiveness; to
provide teachers and administrators with information on delivering effective instruction;
to determine and implement standards and assessments that will ensure that American
students are college- and career-ready; and, to provide students in under-performing
schools with support and interventions that will boost their educational achievement
(U.S. Department of Education, Office of Planning, Evaluation and Policy Development,
2010). Despite these goals, at this time, it is unknown whether evidence-based practices
and interventions are being consistently implemented for non-academic purposes.
Section summary. This section has reviewed past and current efforts by various
groups to address educational inequalities experienced by students with disabilities, and
students who come from socially disenfranchised or economically disadvantaged
backgrounds. An important theme that can be drawn from this review is that, over time,
weaknesses in the educational programming provided to different student groups have
been revealed, resulting in the federal government passing laws and regulations holding
school professionals accountable for demonstrating that these specific weaknesses have
been addressed. In contemporary times, more proactive measures have been
implemented, as school professionals are being held accountable for providing students
with quality educational experiences. Consequences are enforced when schools are
unable to demonstrate regular growth in student achievement. As education
professionals, school psychologists are now responsible for meeting the needs of a range
of students, as they address the educational and behavioral needs of the entire student
body through direct and indirect prevention and intervention efforts. Although current
legislation is still being drafted, it appears as though mandates for accountability, through
2003). Internet surveys may also provide researchers with access to participants who
oppose providing identifying or personal information, and the perceived anonymity
associated with internet surveys may yield more accurate responses (Granello &
Wheaton, 2004) as some participants consider this survey method to be more secure
compared to mail or telephone surveys (Granello & Wheaton, 2004; Van Selm &
Jankowski, 2006).
In many cases, a reduction in survey implementation and response time for
internet surveys has been noted (Dillman, 2007; Evans & Mathur, 2005; Granello &
Wheaton, 2004), as researchers are not encumbered by the time it takes for participants to
mail in their responses (Vaux & Briggs, 2006). Less time is also required when sending
follow-up requests for participation, which may result in higher response rates within a
specified timeframe for conducting research (Evans & Mathur, 2005). Online surveys
can also be designed to require participants to provide responses for all items before
submitting their surveys (Evans & Mathur, 2005). The survey designer can program a
skip pattern, preventing respondents from having to respond to irrelevant items (Dillman,
2007). In addition, the designer can program pop-up instructions for immediate
assistance rather than referring the respondent to a separate set of instructions removed
from the actual item (Dillman, 2007). Drop-down boxes containing lengthy lists of
possible responses can be used to make coding of answers easier for the researcher
(Dillman, 2007). Responses can be used to screen respondents (Alreck & Settle, 2004;
Dillman 2007) and automatically direct them to the next most relevant set of items
(Dillman, 2007).
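The survey features described above, required items, skip patterns, and response-driven routing, can be expressed as a small data structure plus routing logic. The sketch below is purely illustrative: the item texts, field names, and helper functions are hypothetical, not drawn from any particular survey package.

```python
# Hypothetical survey definition: each item may declare a skip rule that
# routes the respondent past irrelevant questions, and required items
# must be answered before the survey can be submitted.
SURVEY = [
    {"id": "q1", "text": "Do you provide counseling services?", "required": True,
     "skip_to_if": {"answer": "no", "target": "q4"}},
    {"id": "q2", "text": "How many students do you counsel weekly?", "required": True},
    {"id": "q3", "text": "Which interventions do you use?", "required": False},
    {"id": "q4", "text": "Years of experience?", "required": True},
]

def next_item(survey, current_id, answer):
    """Return the id of the next item to display, honoring skip rules."""
    ids = [item["id"] for item in survey]
    idx = ids.index(current_id)
    rule = survey[idx].get("skip_to_if")
    if rule and answer == rule["answer"]:
        return rule["target"]          # skip pattern: jump past irrelevant items
    return ids[idx + 1] if idx + 1 < len(ids) else None

def missing_required(survey, responses):
    """Enforce 'respond to all required items before submitting'."""
    return [i["id"] for i in survey if i["required"] and not responses.get(i["id"])]
```

For example, a respondent answering "no" to the hypothetical q1 would be routed directly to q4, and a submission lacking any required answer would be rejected with a list of the missing item ids, mirroring the required-response and skip-pattern features Dillman (2007) and Evans and Mathur (2005) describe.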
Some respondents perceive internet surveys to be more interesting when they are
designed to be interactive, as a variety of pictures, animation, and audio and video clips
can be incorporated into an electronic survey (Dillman, 2007; Sax et al., 2003). Internet
survey questions can also be tailored to change dependent on participant response (Evans
& Mathur, 2005). Furthermore, internet surveys can also be completed during the
respondent's leisure time, whereas this is not always the case when researchers utilize
telephone survey methods (Sax et al., 2003).
Before deciding to use surveys as a method of data collection, there are several
limitations that researchers should consider. Once a survey has been designed and
administered, it cannot be altered during the data collection process. In
addition, at the conclusion of data collection, researchers may discover that the sample
they created and surveyed did not match the population of interest (Mangione, 1995).
Survey questions must be general enough to facilitate comprehension by a large number
of respondents, and as a result, may omit questions of interest to the researcher and
certain respondents (Barribeau et al., 2005). Self-administered surveys also cannot
take into account idiosyncrasies in each respondent's context, or how such context
will affect the accuracy of responses relative to the researcher's intentions
(Barribeau et al., 2005). Furthermore, respondents may
have difficulty aligning their views and experiences with the dichotomies or scales
presented to them as answer choices on surveys, which is a potential threat to validity
(Barribeau et al., 2005). Additional concerns related to self-administered surveys include
bias in the responding sample (i.e., purposefully falsifying responses, picking the
response that immediately comes to mind as opposed to the most accurate response, or
selecting the same response without considering the question), nonresponse to certain
professional organizations will not allow researchers to survey their membership
without obtaining permission or establishing some form of relationship (Dillman,
2007).
Recent data gathered by the Nielsen Company (2008) estimated that 80% of
homes across the United States contain a computer (desktop or laptop), and among those,
91.6% have some form of internet access. Despite these findings, however, the
distribution of internet access is not uniform (Nielsen Company, 2008), may be limited
within various areas, cultures, and countries (Van Selm & Jankowski, 2006), and appears
to vary depending on a variety of factors. For example, internet access is correlated with
education level and the combined annual income of a given household, such that
increases in these factors also increased the likelihood of internet access (Chesley, 2006;
Nielsen Company, 2008; Redpath et al., 2006). Across the country, internet access is
lowest among Hispanic and African-American households (Sax et al., 2003), as well as
those in which the head of the household has not completed high school (Chesley, 2006;
Nielsen Company, 2008). The Southeast Central region of the United States,
encompassing Alabama, Mississippi, Tennessee, and Kentucky, contains the highest
number of households without internet access (Nielsen Company, 2008). In contrast,
larger cities, such as Washington, DC, Norfolk, Salt Lake City, Boston, and Portland,
contain the highest percentage of homes with internet access (Nielsen Company, 2008).
Given this information, researchers are cautioned that, although electronic survey
methods provide less expensive access to a larger portion of the population, and more
households report computer and internet access, it is inaccurate to assume that a
nationally-representative section of the population has been sampled without collecting
and analyzing demographic data (Andrews et al., 2003; Vaux & Briggs, 2006) to avoid
sampling bias and error (Alreck & Settle, 2004).
When considering the use of survey research to gather information, researchers
are advised to study the population of interest in order to determine whether this
population has uniform internet access and has a history of using internet surveys
(Dillman, 2007). It is more likely that researchers will obtain acceptable response
rates if their survey population has regular access to the internet and e-mail (Granello &
Wheaton, 2004; Pealer & Weiler, 2003). The potential for sample bias may be reduced
when, in their sampling practices, researchers account for differences in internet access
and use among different groups in the general population (Andrews et al., 2003; Vaux &
Briggs, 2006), and decide whether an internet based survey is the most appropriate
format for gathering information (Pealer & Weiler, 2003). If the use of an internet survey
is considered appropriate, researchers are advised to design internet surveys in such a
manner that they can be clearly read and interpreted (Dillman, 2007). In addition, it is
recommended that, before formal survey implementation, researchers pilot their surveys
with a small subset of the sample population to examine brevity, reliability, and validity
(Andrews et al., 2003; Granello & Wheaton, 2004).
Application of the Tailored Design Method (TDM; Dillman, 2007) when creating
internet surveys is one way that researchers can ensure that the questionnaires used in
their research can be clearly read and interpreted by respondents. The development of
TDM has occurred over time, guided by social exchange theory, and revised by research
results exploring specific aspects of the survey process and their effect on the quality and
quantity of responses (Dillman, 2007). Contemporary research on TDM has focused on
ways that the design and layout of internet surveys impact respondents. As a summary of
this research, Table 14 provides a list of guidelines from the TDM that researchers can
use to design surveys with minimal error to achieve accuracy and high response rates.
Although internet surveys are a popular and efficient method of collecting
information on a population of interest (Dillman, 2007; Heun, 2001; Jackson, 2003),
researchers are advised to consider several limitations and factors associated with this
survey method before implementing it (Dillman, 2007). For example, because internet
familiarity, use, and access are not uniform across the country (Dillman, 2007; Evans &
Mathur, 2005), researchers are cautioned to examine demographic characteristics related
to the population of interest to determine whether an internet survey is the most efficient
way to collect data (Andrews et al., 2003; Vaux & Briggs, 2006). When designing
internet surveys, attention should be paid to the use of words and symbols in order to
facilitate ease and accuracy of responses by participants (Dillman, 2007). Piloting the
survey with a small group of the target population is one way that researchers can obtain
valuable feedback related to the effectiveness of their surveys before engaging in large-scale implementation (Andrews et al., 2003; Granello & Wheaton, 2004).
Chapter Summary
One common theme throughout the topics discussed in this chapter is the
evolution of educational practices in response to the needs of students in American public
schools. The position of the school psychologist emerged in response to the rapid
expansion of the school population in the early 1900s, and continued to evolve over time
as methods of problem-solving, training, and the daily responsibilities of the school
psychologist developed and changed to provide an optimal learning environment.
Table 14
Guidelines and Considerations for Creating Internet Surveys Following the TDM
Framework
1) Respondents were more likely to endorse items presented in a forced-choice as
opposed to a check-all format.
2) For scalar questions with answer choices presented in a drop down box, display
all answer choices in a drop down box without requiring scrolling.
3) The use of symbols can increase response rates by cueing respondents to attend to
specific elements of questions and response choices without adding words
(Christian & Dillman, 2004).
4) When designing items with specific instructions that deviate from the general
directions, it is recommended that those instructions be placed after the question
but before the answer choices.
5) Numbering items can prevent respondents from making response errors. Some
respondents consider unnumbered items at the beginning of a survey to be
practice items, and as such, do not always answer them (Dillman & Redline,
2004). Skip patterns can be programmed so respondents answer only relevant
items, and in these cases, numbers can cause confusion. In place of numbers,
some survey designers will signify items using symbols, such as a question mark
or asterisk. This may be confusing to respondents, as the omission of a culturally
understood guideline may require extra attention to figuring out the use and
meaning of the symbol. Survey designers are cautioned to evaluate the use of
numbers in survey construction and navigation, as they are a culturally understood
and reliable method of survey navigation.
6) Cultural expectations prompt respondents to assume that the most positive
categories will be placed at the top of vertical scales and to the left on horizontal
scales, while the most negative categories will be at the bottom or to the right
(Tourangeau, Couper, & Conrad, 2004). Respondents also tend to assume that the
middle option will represent the average or typical value (Tourangeau, Couper, &
Conrad, 2004).
7) Responding to scalar questions is more difficult when graphics or verbal
descriptions of the scale are taken out of the respondent's visual display, requiring
the respondent to refer back to the question stem and then to the response area.
8) Provide instructions that facilitate accurate responding the first time the
respondent attempts an item, as an error message containing corrective feedback
can cause the respondent to experience frustration and discontinue responding.
9) To enable respondents to provide answers in the correct format without receiving
an error message, designers may need to provide multiple visual cues (e.g.,
appropriate answer space size, use of symbols instead of words, numbers clearly
associated with symbols).
Research and best practices in counseling now focus on helping school psychologists
deliver counseling interventions that are evidence-based (Silverman & Hinshaw, 2008),
using the problem-solving model (Upah, 2008). The literature on student mental health
cites not only the connection between emotional well-being and academic success
(Haertel et al., 1983; Wang et al., 1990), but also the importance of the problem-solving
model and data-based decision making when designing and implementing counseling
interventions (Miltenberger, 2005; Tilly & Flugum, 1995; Upah, 2008).
Federal education legislation has also evolved over time to meet the needs of
students, addressing disparities in the quality of education that result from economic
disadvantage (e.g., ESEA) or special needs (e.g., IDEA). Currently, however, it is no
longer enough for those working with children to strive to meet the needs of students, as
those working in both the public and private sectors are held increasingly responsible for
using evidence-based practices and documenting accountability for their actions (Kazdin,
2008). Although, at this time, federal legislation holds school professionals accountable
for students' educational outcomes (e.g., NCLB), in time such standards of accountability
may also explicitly apply to school psychologists as they provide counseling
interventions to improve students' social-emotional and behavioral outcomes. Research
documenting evidence-based interventions, the problem-solving model, and data-based
decision making practices for counseling and behavioral interventions may also provide
measures of accountability for these interventions. At this time, however, the extent to
which school psychologists use evidence-based interventions, the problem-solving
model, and data-based decision making in their counseling interventions is unknown.
Research Questions
This study set out to survey practicing school psychologists in an attempt to
identify (a) their general counseling practices, (b) their use of best practices related to the
problem-solving model, and (c) demographic variables that might impact their counseling
practices (e.g., training, years of experience, school size, other roles and responsibilities).
General counseling practices of school psychologists. The following questions
were designed to gather information related to the current counseling practices of school
psychologists.
1) What percentage of school psychologists provide group and/or individual
counseling? Are school psychologists counseling general or special education
students, or both?
2) Approximately how many students do school psychologists recommend
declassifying from counseling each year, and what reasons are most commonly
cited when making this recommendation?
3) Are there any demographic differences (e.g., training, years of experience, other
roles and responsibilities) related to school psychologists' group and individual
counseling practices?
4) What type of training and professional development have school psychologists
received related to planning and implementing counseling as a direct
social-emotional and behavioral intervention?
School psychologists' use of best practices related to the problem-solving
model. The following questions were developed to determine which aspects of the
problem-solving model are most often employed in the design and implementation of
counseling as a direct intervention.
CHAPTER 3: Methodology
Overview
The purpose of this chapter is to describe the methods and procedures that were
used in the current study. A description of the participants and instrumentation that were
used to gather information regarding the counseling practices of school psychologists is
provided.
Participants
Participants for the current study were selected from a random sample of school
psychologists who were listed in the registry of Nationally Certified School Psychologists
(NCSPs) on the National Association of School Psychologists (NASP) website
(www.nasponline.org). The original survey population was composed of 1,000 school
psychologists currently employed in public schools. A second sample of 500 school
psychologists was utilized, and gathered using the same method as the first sample.
Instrumentation
Several instruments were used in this study to explain to potential participants the
purpose and importance of this study and to gather information related to specific
counseling practices employed by school psychologists currently practicing in school
settings. Specific information describing each instrument is described in this section.
Survey. The survey employed in this study was modeled on a previous survey of
school psychology counseling practices (Yates, 2003), and was written using PsychData
(www.psychdata.com), incorporating guidelines provided by Dillman (2007) according to
the Tailored Design Method (TDM; see Appendix A). The use of the TDM was meant to
maximize response rates while at the same time minimizing survey error.
solicited the participation of those who had not (Appendix C). This email contained the
same information as the original cover letter email, in terms of identifying the researcher
and affiliated institution, and informing participants of the purpose, significance,
anonymity, and procedural safeguards associated with the study.
Procedure
Prior to sending cover letter emails inviting potential respondents to complete the
survey, five school psychologists from the Albany, New York area previewed the
instruments. These school psychologists were asked to complete the survey and provide
feedback related to the instruments' readability, structure, and clarity (Appendix D).
Based on this feedback, survey items were evaluated for content, design, and length.
All five school psychologists who were contacted participated in the pilot study.
Analysis of the feedback that was provided revealed that survey completion time ranged
from 10 to 20 minutes. The majority of the pilot subjects indicated that the questions and
terms used were clear and easy to understand. None of the questions was deemed
unnecessary or irrelevant. Provided in Table 15 is a summary of respondent comments
and suggestions that were considered in revisions to the survey before its final
implementation. Overall, the results of the pilot study were positive and supported the
decision to move forward with the study.
This study was then conducted via an online survey. The cover letter email was
sent to a random sample of school psychologists selected from the NASP NCSP
directory. This cover letter included a link to the survey created using PsychData, which
is a secure website that allows for the collection of research data while preserving
respondent anonymity. In order to maintain anonymity, specific identifying information,
Table 15
Summary of Feedback From Pilot Subjects

Comment/Suggestion                               Changed   Rationale/Outcome

Differentiate between school psychologists        No       Subjects could specify their
who provide crisis-based or more long-term                 counseling practices on Item 4,
counseling                                                 and at the end of the survey

Provide option for reporting the number of        No
students discontinued from counseling in
relation to the number of students seen for
counseling each year

Change presentation of psychologist:student       Yes
(Item 9) as this might be confusing

Items requesting the frequency of certain         No
counseling practices were written such that
respondents could select more than one
response
such as respondents' names, was not requested. At the end of the survey, all respondents
were invited to follow a second link to enter a raffle for one of two $50 gift certificates as
compensation for their participation. Illustrated in Table 16 are the mailing and response
data.
Initially, survey invitations were sent to 1,000 NCSPs. Due to a low response
rate, a second sample of 500 NCSPs was gathered and sent an email invitation to
complete the survey. Both the original sample and the second sample received initial and
reminder email invitations. In an attempt to increase the response rate, the initial sample
received a second reminder email. Each mailing had a certain number of emails that
were returned to the sender as undeliverable. Additionally, some of those contacted
indicated that they (a) were retired, (b) did not provide counseling, and/or (c) did not
provide psychological services in a school and therefore did not complete the survey.
The total number of emails sent (n=1,500) was adjusted for the emails that were returned
(n=171) and the replies declining participation in the survey for one of the reasons listed
earlier in this paragraph (n=65). Dividing the total number of responses (n=283) by the
adjusted number of valid emails (n=1,264) yielded a total response rate of 22.39%.
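The response-rate arithmetic above can be expressed in a few lines of Python (an illustration of the calculation, not part of the original analysis):

```python
# Adjusted response rate: completed surveys divided by valid invitations.
total_sent = 1500    # unique email invitations across both samples
undeliverable = 171  # emails returned to sender
ineligible = 65      # retired, not providing counseling, or not school-based

completed = 283      # surveys returned

valid_invitations = total_sent - undeliverable - ineligible  # 1,264
response_rate = completed / valid_invitations

print(valid_invitations)       # 1264
print(f"{response_rate:.2%}")  # 22.39%
```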
Table 16
Mailing and Response Data

                                              Returned                Do Not      Not Employed  Completed  Return
Mailing (Date Sent)                   n Sent  Undeliverable  Retired  Provide     in a School   Surveys    Rate
                                                                      Counseling

Initial Sample (8/29/11)               1,000     135                                                97     11.43%
Initial Sample Reminder (9/12/11)      1,000      11                                                64      6.58%
Second Sample (9/19/11)                  500      15                                                50     10.46%
Second Sample Reminder (10/3/11)         500      10                                                18      3.67%
Initial Sample Reminder (10/16/11)     1,000                                                        54      5.53%
Total                                  1,500     171            25       23           17           283     22.39%
CHAPTER 4: Results
Overview
This study was conducted to gain information about the counseling practices of
school psychologists. This chapter summarizes the results of the survey data gathered in
response to the research questions presented in Chapter 2. A discussion of the respondent
characteristics and demographic data follows, along with the quantitative and qualitative
results of the survey.
Data Analysis Plan
Several methods of data analysis were used to address the research questions
posed in this study. Descriptive statistics (e.g., frequencies, percentages, and averages)
were calculated to report on the general counseling practices of school psychologists, and
their use of the problem-solving model and accountability. A series of chi-square
analyses were conducted to determine the presence of relationships between counseling
practices and use of the problem-solving model and accountability and the demographic
information describing the school psychologists in this sample. A table displaying the
specific data analysis procedure used for each survey item can be found in Appendix E.
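The chi-square portion of this plan can be sketched in a few lines; the data and category labels below are invented for illustration and are not the study's dataset:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Invented example data: each entry is one respondent's reported
# counseling format ("group", "individual", or "both").
responses = ["both"] * 14 + ["individual"] * 7 + ["group"] * 4

# Descriptive statistics: frequencies and percentages.
labels, counts = np.unique(responses, return_counts=True)
percentages = 100 * counts / counts.sum()
for label, count, pct in zip(labels, counts, percentages):
    print(f"{label}: {count} ({pct:.1f}%)")

# Chi-square test of independence on a hypothetical
# counseling-practice-by-degree crosstab (3 practices x 2 degree levels).
crosstab = np.array([[12, 5], [20, 14], [70, 33]])
chi2, p, dof, expected = chi2_contingency(crosstab)
print(dof)  # 2
```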
Respondent Characteristics/Demographic Data
As discussed in Chapter 3, an online survey was created using PsychData and
emailed to a national sample of 1,500 randomly selected Nationally Certified School
Psychologists (NCSPs). A total of 283 survey responses were analyzed, representing a
response rate of 22.39%. Demographic characteristics of the respondents are
summarized in Table 17. It should be noted that not all participants completed all of the
demographic information items, and that the rate of non-response is listed next to each
Table 17
Demographic Characteristics of Respondents

Variable (% of Non-Response)                                 n        %

Graduate Degrees Received
  MA/MS                                                     43     15.4
  Certificate/Specialist                                   172     61.6
  PhD/PsyD/EdD                                              64     22.9
Graduate Program Accreditation
  NASP                                                     247     87.3
  APA                                                       78     27.6
  NCATE                                                     50     17.7
  Your State                                               175     61.8
  Not Accredited                                             1      0.4
Course Topics Included in Graduate Coursework
  Academic Interventions                                   247     87.3
  Behavioral Interventions                                 256     90.5
  Counseling and Psychotherapy With Children               244     86.2
  Counseling Children With Developmental Disabilities      127     44.9
  Group Counseling                                         212     74.9
  Multicultural Counseling                                 169     59.7
Years Since Last Degree Earned
  0-5                                                      125     45.0
  6-10                                                      62     22.3
  >10                                                       91     32.7
Years of Employment in a School Setting (1.1%)
  0-5                                                      116     41.4
  6-10                                                      54     19.3
  >10                                                      110     39.3
Grade Levels Served By School Psychologists
  Elementary School Students                               220     77.7
  Middle/Junior High School Students                       136     48.1
  High School Students                                     123     43.5
Types of Schools Served (2.5%)
  Rural                                                     67     24.3
  Suburban                                                 106     38.4
  Urban                                                     50     18.1
  Mixed                                                     43     15.6
  Other (e.g., reservation, department of defense
    school, small city)                                     10      3.6
Psychologist to Student Ratio (3.2%)
  1:<500                                                    33     12.0
  1:500-999                                                 77     28.1
  1:1000-1499                                               81     29.6
  1:1500-2000                                               47     17.2
  1:>2000                                                   36     13.1
Region of Employment (1.8%)
  Northeast                                                 70     25.2
  Midwest                                                   60     21.6
  South                                                     75     27.0
  West                                                      73     26.3
Themes of Professional Development Attended Over the
Past Five Years
  No Child Left Behind                                      61     21.6
  Academic and Behavioral Accountability                   162     57.2
  Provision of Counseling                                  104     36.7
  Evidence-Based Behavioral Interventions                  226     79.9
  Data-Based Decision Making                               195     68.9
  Response to Intervention                                 245     86.6
  Other (e.g., Autism, Crisis Response, Positive
    Behavior Support, Ethics, Legal Mandates)               49     17.3
variable displayed. Percentages were calculated from the total number of participants
who provided demographic data on each item, and not from the overall number of
participants. The majority of respondents held certificate or specialist degrees (61.6%)
from NASP-approved (87.3%) and state-accredited (61.8%) graduate programs. Their
coursework included training related to academic (87.3%) and behavioral (90.5%)
interventions, counseling and psychotherapy with children (86.2%), and group counseling
(74.9%). Over half of the respondents (59.7%) also studied multicultural counseling as
part of their training. Almost half of the respondents (45%) completed their graduate
training within the last 5 years. Nearly equal numbers of respondents indicated that they
had been employed in a school setting for 5 years or less (41.4%) or had more than 10
years of experience (39.3%). Typical school settings were suburban (38.4%) elementary
(77.7%) schools, at a psychologist-to-student ratio of 1:500-999 (28.1%) or 1:1,000-1,499
(29.6%). Respondents were employed in similar numbers across the different regions of
the United States (responses ranged from 21.6% to 27% depending on region). Within the
past five years, many had attended professional development related to Response to
Intervention (86.6%), Evidence-Based Behavioral Interventions (79.9%), Data-Based
Decision Making (68.9%), and Academic and Behavioral Accountability (57.2%). As
displayed in Table 18, in terms of time allocation, 42.2% of respondents spent between
25% and 50% of their time on assessment, and devoted 25% of their time or less to direct
interventions (80%), consultation and indirect services (70.9%), research (100%),
administration (96.4%), or systems-level activities (97.1%).
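Because each percentage is taken over the respondents to that item rather than the full sample of 283, the denominators can be checked directly; for example, using the graduate-degree counts from Table 17 (a small illustration of the arithmetic):

```python
# Graduate-degree counts from Table 17; 279 of the 283 respondents
# answered this item.
degrees = {"MA/MS": 43, "Certificate/Specialist": 172, "PhD/PsyD/EdD": 64}

n_item = sum(degrees.values())  # item respondents, not the full sample
percentages = {k: round(100 * v / n_item, 1) for k, v in degrees.items()}

print(n_item)                                 # 279
print(percentages["Certificate/Specialist"])  # 61.6
```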
General Counseling Practices
In addition to demographic data, respondents were asked to provide general
Table 18
Percentage of Time Allocated to Professional Activities

                                                  % of Time Spent: % of Respondents (n)
Activity                M (SD)            0-25          26-50         51-75        76-100

Assessment              42.67 (23.00)    28.4 (78)     42.2 (116)    20.7 (57)    8.7 (24)
Direct Interventions    16.40 (14.67)    80.0 (220)    18.2 (50)      1.8 (5)     0.0 (0)
Consultation and
  Indirect Services     22.21 (14.07)    70.9 (195)    26.5 (73)      1.8 (5)     0.7 (2)
Research                 2.19 (9.72)    100.0 (275)     0.0 (0)       0.0 (0)     0.0 (0)
Administration           5.93 (9.72)     96.4 (265)     3.3 (9)       0.4 (1)     0.0 (0)
Systems-level
  Activities             7.41 (10.14)    97.1 (267)     2.2 (6)       0.7 (2)     0.0 (0)
Other (e.g.,
  paperwork)             3.55 (8.00)     98.2 (270)     1.1 (3)       0.4 (1)     0.4 (1)

Note: 2.8% non-response rate for each activity within this item
information related to their counseling practices. As shown in Table 19, the majority of
respondents (54.8%) provide group and individual counseling to both general and
special education students (73.5%). Some variation was noted in the average number of
students recommended for discontinuation from counseling each year, with respondents
recommending discontinuation at low (e.g., 0 [15.1%], 1 or 3 [13.8% each], and 2
[24.6%] students) and high frequencies (e.g., >7 [12.5%] students). The most common reason for
recommending discontinuation from counseling was that counseling goals were met
(55.9%). Nearly half of the sample (44.4%) indicated using print or online resources
when writing behavioral goals, clarifying the problem, or when determining behavioral
expectations.
Comparison of Demographic Variables and General Counseling Practices
In addition to exploring specific counseling and discontinuation practices of
current school psychologists, one of the goals of this survey was to determine whether
group and individual counseling practices varied by demographic characteristics (e.g.,
training, years of experience, other roles and responsibilities). To determine this, four
multinomial logistic regression models were created to ascertain whether respondents'
graduate degree (training), years of experience, and time spent counseling could predict
the type of counseling they engaged in, the students they served, the number of students
they discontinued each year, and their reasons for discontinuation. For the purposes of
analysis, the variable graduate degree was re-categorized from three to two categories,
such that the first category included respondents with a MA/MS or Certificate/Specialist
degree, and the second category included respondents with doctoral-level training (PhD,
PsyD, EdD). The number of students discontinued from counseling each year was also
re-categorized, from nine response categories into four (0, 1-3, 4-6, and 7 or more).
Table 19
Counseling Practices of School Psychologists

Variable (% Non-Response)                                       n        %

Type of Counseling Provided (14.5%)
  Group Counseling Only                                        19      7.9
  Individual Counseling Only                                   68     28.1
  Group and Individual Counseling                             155     54.8
Children Served in Counseling (14.5%)
  Special Education Students                                   62     25.6
  General Education Students                                    3      1.2
  Special and General Education Students                      177     73.5
Number of Students Recommended for Discontinuing
Counseling Each Year (18%)
  0                                                            35     15.1
  1                                                            32     13.8
  2                                                            57     24.6
  3                                                            32     13.8
  4                                                            21      9.1
  5                                                            22      9.5
  6                                                             2      0.9
  7                                                             2      0.9
  >7                                                           29     12.5
Reasons for Discontinuing Counseling Services
  Counseling Goals Have Been Met                              127     55.9
  No Positive Effect on Behavior                               24     10.6
  Student Leaves School/District                               26     11.5
  Parent Preference                                             5      2.2
  Other (e.g., school policy, outside referral,
    student need)                                              45     19.8
Use of Print/Online Resources for Writing Goals, Problem
Clarification, or Determining Behavioral Expectations (12.4%)
  Use Print or Online Resources                               110     44.4
  Do Not Use Print or Online Resources                        138     55.6
Table 20
Multinomial Regression Predicting Type of Counseling from Graduate Degree, Years of
Experience, and Time Spent Counseling

Model/Effect              -2 Log Likelihood      χ²     df    Sig.
Intercept                       122.991
Final                            96.701        26.29    10    .003
Graduate Degree                 101.099         4.40     2    .111
Years of Experience             110.014        13.31     4    .010
Time Spent Counseling           104.223         7.52     4    .111
Note: R² = .105 (Cox and Snell), .129 (Nagelkerke), .066 (McFadden)
Table 21
Multinomial Regression Predicting Students Served in Counseling from Graduate
Degree, Years of Experience, and Time Spent Counseling

Model/Effect              -2 Log Likelihood      χ²     df    Sig.
Intercept                        76.18
Final                            62.98         13.20    10    .213
Graduate Degree                  63.07          0.09     2    .956
Years of Experience              73.82         10.84     4    .028
Time Spent Counseling            65.57          2.59     4    .629
Note: R² = .054 (Cox and Snell), .075 (Nagelkerke), .044 (McFadden)
Table 22
Multinomial Regression Predicting the Number of Students Discontinued from
Counseling from Graduate Degree, Years of Experience, and Time Spent Counseling

Model/Effect              -2 Log Likelihood      χ²     df    Sig.
Intercept                       180.84
Final                           131.42         49.42    15    .000
Graduate Degree                 135.87          4.46     3    .216
Years of Experience             158.64         27.22     6    .000
Time Spent Counseling           143.85         12.43     6    .053
Note: R² = .195 (Cox and Snell), .214 (Nagelkerke), .090 (McFadden)
Table 23
Multinomial Regression Predicting Reasons for Discontinuation from Counseling from
Graduate Degree, Years of Experience, and Time Spent Counseling

Model/Effect              -2 Log Likelihood      χ²     df    Sig.
Intercept                       161.58
Final                           135.95         25.62    20    .178
Graduate Degree                 136.77           .82     4    .925
Years of Experience             150.30         14.35     8    .073
Time Spent Counseling           147.64         11.69     8    .166
Note: R² = .108 (Cox and Snell), .119 (Nagelkerke), .047 (McFadden)
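Each effect row in Tables 20 through 23 is a likelihood-ratio test: the χ² statistic is the difference between the -2 log likelihood of the model with that effect removed and the -2 log likelihood of the final model, referred to a chi-square distribution with the corresponding degrees of freedom. A brief Python check using the Years of Experience row of Table 20 (an illustration of the arithmetic, not the original analysis):

```python
from scipy.stats import chi2

# Values from Table 20 (multinomial regression predicting type of counseling)
neg2ll_final = 96.701     # -2 log likelihood of the final model
neg2ll_reduced = 110.014  # -2 log likelihood with Years of Experience removed
df = 4                    # parameters dropped when the effect is removed

lr_chi2 = neg2ll_reduced - neg2ll_final  # likelihood-ratio chi-square
p_value = chi2.sf(lr_chi2, df)           # upper-tail probability

print(f"{lr_chi2:.2f}")   # 13.31
print(f"{p_value:.3f}")   # 0.010
```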
Table 24
Chi-Square Analysis Comparing Type of Counseling and Years of Experience

                                          Years of Experience
Variable                             0-5       6-10       >10     Total

Group Counseling Only
  Observed                          10.0       1.0        8.0        19
  Expected                           8.1       3.7        7.2        19
  Std. Residual                      0.7      -1.4       -0.3
Individual Counseling Only
  Observed                          19.0      11.0       37.0        67
  Expected                          28.6      13.1       25.3        67
  Std. Residual                     -1.8      -0.6        2.3
Group and Individual Counseling
  Observed                          74.0      35.0       46.0       155
  Expected                          66.2      30.2       58.5       155
  Std. Residual                      1.0       0.9       -1.6
Total                              103.0      47.0       91.0       241
Note: χ² = 15.83, df = 4, Sig. = 0.003
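The expected counts and standardized residuals in Table 24 follow directly from the row and column totals of the observed counts; a sketch verifying them (observed counts taken from the table, recomputed here with scipy for illustration):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Observed counts from Table 24: rows are type of counseling (group only,
# individual only, both); columns are years of experience (0-5, 6-10, >10).
observed = np.array([
    [10,  1,  8],
    [19, 11, 37],
    [74, 35, 46],
])

chi2, p, dof, expected = chi2_contingency(observed)

# Standardized residual for each cell: (observed - expected) / sqrt(expected)
std_resid = (observed - expected) / np.sqrt(expected)

print(f"{chi2:.2f} {dof}")       # 15.83 4
print(f"{std_resid[1, 2]:.1f}")  # 2.3 (individual-only counseling, >10 years)
```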
Table 25
Chi-Square Analysis Comparing Number of Students Discontinued From Counseling
Each Year and Years of Experience

                                          Years of Experience
Number of Students                   0-5       6-10       >10     Total

0
  Observed                          17.0       2.0       16.0        35
  Expected                          14.9       6.6       13.4        35
  Std. Residual                      0.5      -1.8        0.7
1-3
  Observed                          60.0      29.0       32.0       121
  Expected                          51.6      22.9       46.4       121
  Std. Residual                      1.2       1.3       -2.1
4-6
  Observed                          18.0      10.0       17.0        45
  Expected                          19.2       8.5       17.3        45
  Std. Residual                     -0.3       0.5       -0.1
≥7
  Observed                           4.0       3.0       24.0        31
  Expected                          13.2       5.9       11.9        31
  Std. Residual                     -2.5      -1.2        3.5
Total                               99.0      44.0       89.0       232
Note: χ² = 31.96, df = 6, Sig. = 0.000
Table 26
Use of Intervention Components of the General Problem-Solving Model

                               Employ This     Do Not Employ    Did Not Respond
Intervention Component         Component       This Component   to This Item
                               % (n)           % (n)            % (n)

Behavioral definition          87.5 (217)      12.5 (31)        12.4 (35)
Baseline data                  77.7 (188)      22.3 (51)        14.5 (41)
Problem validation             83.8 (196)      16.2 (38)        17.3 (49)
Problem analysis               69.2 (162)      30.8 (72)        17.3 (49)
Goal setting                   81.3 (191)      18.7 (44)        17.0 (48)
Intervention plan
  development                  85.1 (194)      14.9 (34)        19.4 (55)
Measurement strategy           78.9 (176)      21.1 (47)        21.2 (60)
Decision-making plan           70.0 (156)      30.0 (67)        21.2 (60)
Progress monitoring            82.8 (173)      17.2 (36)        26.1 (74)
Formative evaluation           55.3 (114)      44.7 (92)        27.2 (77)
Treatment integrity            77.9 (159)*     22.1 (45)        27.9 (79)
Summative evaluation           69.3 (142)      30.7 (63)        27.6 (78)

Note: Number of responses and percentages vary by item as some respondents elected
not to respond; * This item contained options for Sometimes (62.7% [128]) and Always
(15.2% [31]) measuring Treatment Integrity. For the purposes of analysis and reporting,
these categories have been combined.
frequency with which respondents reportedly apply specific aspects of the most common
components of the problem-solving model as they design and implement counseling as a
direct intervention. For data analysis purposes, the frequency categories of sometimes
and always were combined. Frequency percentages for selected components were
calculated from the total number of participants who indicated using each corresponding
common component, and not from the overall number of participants.
Behavioral definition and baseline data collection. When writing behavioral
definitions, highest response rates were noted for action verbs describing student
behavior in observable terms (99.1%), and descriptions of the frequency (94.9%),
intensity (86%), and duration (80.2%) of the behavior. Lowest rates of endorsement were
provided for describing latency (52.1%) and accuracy (59.4%) of student behavior. The
most commonly used methods of baseline data collection were direct behavioral
observations (98.4%), third-party behavior ratings (99%), objective self-reports (87.1%),
and third-party interviews (90.3%). To determine a stable pattern of student behavior,
respondents reported collecting 3 to 5 baseline data points. Displayed in Tables 27, 28,
and 29 are the results for the use of these specific components.
Goals, intervention planning, measurement, and decision-making.
Determining what should be done to address the problem behavior involves goal setting,
developing an intervention plan, devising a measurement strategy, and coming up with a
plan for decision-making. Results showing the use of these specific components can be
found in Tables 30, 31, 32, and 33. Respondents indicated that they include timeframe
(92%), condition (97.8%), behavior (98.4%), and criteria (97.7%) some or all of the time
when writing a behavioral goal. When writing counseling intervention plans,
Table 27
Use of Specific Components of Behavioral Definition Composition

                                            Employ This        Did Not
Component                                   Component % (n)    Respond % (n)

Action verbs describing student behavior
  in observable terms                       99.1 (214)         23.7 (67)
Frequency of the behavior                   94.9 (203)         24.4 (69)
Latency of the behavior                     52.1 (101)         31.4 (89)
Intensity of the behavior                   86.0 (178)         26.9 (76)
Duration of the behavior                    61.0 (119)         31.1 (88)
Accuracy of the behavior                    59.4 (117)         30.4 (86)
Table 28
Use of Specific Types of Baseline Data Collection

                                       Sometimes Use      Always Use         Did Not
Method                                 This Method % (n)  This Method % (n)  Respond % (n)

Direct behavioral observations         31.9 (61)          66.5 (127)         32.5 (92)
Behavior rating scales completed
  by a third party                     52.4 (100)         46.6 (89)          32.5 (92)
Sociometric techniques                 60.7 (108)          5.1 (9)           37.1 (105)
Interviews                             60.2 (112)         30.1 (56)          34.2 (97)
Objective self-report measures         68.1 (124)         22.0 (40)          35.7 (101)
Projective-expressive techniques       34.6 (63)           2.2 (4)           35.7 (101)

Note: Number of responses and percentages vary by item as some respondents elected
not to respond
Table 29
Average Number of Baseline Data Points Collected To Establish a Stable Pattern of
Student Behavior

Number of Points Collected     % of Responses (n)
1                               3.7 (7)
2                              12.0 (23)
3                              37.2 (71)
4                              14.1 (27)
5                              18.8 (36)
6                               3.7 (7)
7 or Greater                   10.5 (20)
Note: 32.5% (92) non-response rate
Table 30
Use of Specific Criteria When Writing Behavioral Goals

                   Sometimes Use         Always Use            Did Not
Criterion          This Criterion % (n)  This Criterion % (n)  Respond % (n)

Timeframe          36.3 (66)             56.6 (103)            35.7 (101)
Condition          37.2 (67)             60.6 (109)            36.4 (103)
Behavior           19.8 (36)             78.6 (143)            35.7 (101)
Table 31
Use of Specific Components of Counseling Intervention Plans

                                       Sometimes Include     Always Include        Did Not
Component                              This Component % (n)  This Component % (n)  Respond % (n)

Procedures to be used                  41.6 (77)             53.5 (99)             34.6 (98)
                                       58.2 (107)            21.2 (39)             35.0 (99)
Steps and activities to be completed
  during sessions                      52.2 (95)             39.6 (72)             35.7 (101)
                                       46.2 (85)             32.1 (59)             35.0 (99)
                                       52.7 (96)             26.9 (49)             35.7 (101)
Each person's role in the session      47.3 (86)             35.7 (65)             35.7 (101)
Table 32
Use of Specific Components for Measuring Target Behaviors

                                          Sometimes Use         Always Use            Did Not
Measurement Component                     This Component % (n)  This Component % (n)  Respond % (n)

A behavioral definition of the target
  behavior                                19.3 (34)             79.0 (139)            37.8 (107)
Where the behavior will be measured       37.5 (66)             57.4 (101)            37.8 (107)
When the behavior will be measured        39.1 (68)             56.9 (99)             38.5 (109)
Who will measure the behavior             37.1 (65)             58.3 (102)            38.2 (108)
A rationale for why the method is
  appropriate for the target behavior     38.5 (67)             55.7 (97)             38.5 (109)
Table 33
Use of Specific Decision-Making Components

                                          Sometimes Include     Always Include        Did Not
Decision-Making Component                 This Component % (n)  This Component % (n)  Respond % (n)

A determination of the frequency of
  behavioral measurements and data to
  be collected                            37.8 (56)             60.8 (90)             47.7 (135)
How the data will be summarized and
  reported                                53.1 (78)             36.1 (53)             48.1 (136)
How many data points will be collected    47.6 (70)             42.2 (62)             48.1 (136)
How much time will pass before
  intervention data are analyzed          42.6 (63)             52.7 (78)             47.7 (135)
respondents describe the procedures to be used (95.1%), the steps and activities to be
completed during sessions (91.8%), the location where the intervention is to take place
(86.2%), and what each person's role in the session is to be (83%). Respondents
indicated that, when writing measurement plans, they include a behavioral definition of
the target behavior (98.3%), a description of where (94.3%), when (96%), and who
(95.4%) will measure the behavior, a description of the recording method (89.5%), and a
rationale for why the method is appropriate for the target behavior (94.2%). Decision-making
plans reportedly specify the frequency with which behavioral data would be
collected (98.6%), how the data would be summarized and reported (89.2%), how many
data points would be collected (89.8%) and how much time would pass before
intervention data analysis (95.3%), and decision rules for responding to specific data
points (86.2%).
Progress monitoring, formative evaluation, treatment integrity, and
summative assessment. The final intervention components of the problem-solving
model are progress monitoring, formative evaluation, treatment integrity, and summative
evaluation. Data regarding the use of these specific components can be found in Tables
34, 35, 36, 37, 38, and 39. Commonly used methods of collecting progress monitoring
data include direct behavioral observation (98.9%), third-party behavior rating scales
(97.2%), interviews (96%), and objective self-report measures (88%). Variability was
noted in the number of progress monitoring data points respondents considered necessary
to establish a stable pattern of student behavior, with some respondents indicating that
they collect a range of 3 to 8 data points. Only one-third of respondents who engage in
baseline and progress monitoring data collection consistently use the same procedure
Table 34
Use of Specific Progress Monitoring Techniques

                                       Sometimes Use         Always Use            Did Not Respond
Technique                              This Technique % (n)  This Technique % (n)  to This Item % (n)

Direct behavioral observations         43.0 (77)             55.9 (100)            36.7 (104)
Behavior rating scales completed
  by a third party                     62.0 (111)            35.2 (63)             36.7 (104)
Sociometric techniques                 59.5 (100)             1.8 (3)              40.6 (115)
Interviews                             57.6 (102)            38.4 (68)             37.5 (106)
Objective self-report measures         71.3 (124)            16.7 (29)             38.5 (109)
Projective-expressive techniques       32.7 (56)              2.9 (5)              39.6 (112)

Note: Number of responses and percentages vary by item as some respondents elected
not to respond
Table 35
Number of Progress Monitoring Data Points Collected to Establish a Stable Pattern of
Student Behavior

Number of Data Points Collected    % of Item Responses (n)
1                                   1.7 (3)
2                                   6.7 (12)
3                                  28.7 (51)
4                                  13.5 (24)
5                                  13.5 (24)
6                                  10.1 (18)
7                                   5.1 (9)
8                                   1.1 (2)
>8                                 19.7 (35)
Note: 37.1% (105) non-response rate
Table 36
Use of the Same Method for Baseline Data Point and Progress Monitoring Data Point
Collection

Response Type                              % of Item Responses (n)
Yes                                        33.3 (60)
Sometimes, depending on the situation      65.6 (118)
No                                          1.1 (2)
Note: 36.4% (103) non-response rate
Table 37
Sources of Data Considered During Formative Assessment

                                                     Consider This      Did Not
Data Source                                          Data Source % (n)  Respond % (n)

The level of the behavior (how much the behavior
  is occurring during baseline and intervention
  phases as judged by repeated, objective
  measurements of its frequency, duration,
  intensity, or the percentage of intervals in
  which it occurs)                                   90.0 (108)         57.6 (163)
The trend of the behavior                            94.1 (111)         58.3 (165)
Anecdotal descriptions of the behavior               95.0 (114)         57.6 (163)
Subjective assessment of behavior change             90.0 (108)         58.0 (164)
Table 38
Use of Methods to Measure Treatment Integrity

                                         Use This Measurement    Did Not
Measurement Method                       Method % (n)            Respond % (n)

Self-Report                              86.5 (135)              44.9 (127)
Session logs                             88.0 (139)              44.2 (125)
                                         62.9 (95)               46.6 (132)
Permanent products of student work       77.6 (121)              44.9 (127)
Table 39

Sources of Data Considered During Summative Assessment

                                                            Consider This    Did Not
Data Source                                                 Data Source      Respond
                                                            % (n)            % (n)
The level of the behavior (how much the behavior is
  occurring during baseline and intervention phases, as
  judged by repeated, objective measurements of its
  frequency, duration, intensity, or the percentage of
  intervals in which it occurs)                             89.7 (130)       48.8 (138)
The trend of the behavior (change in level from
  baseline to intervention phases)                          97.3 (143)       48.1 (136)
Anecdotal information                                       98.6 (144)       48.4 (137)
Practitioners' subjective assessment of student behavior    90.7 (127)       50.5 (143)
Data documenting student performance in school              97.9 (138)       50.2 (142)
when collecting each type of data (33.3%), while a majority use the same procedure only
sometimes (65.6%). About half of the respondents did not answer the questions regarding
formative assessment, treatment integrity, or summative assessment. Respondents who
collect formative assessment data do so by considering the level (90%) and trend (94.1%)
of the behavior, anecdotal descriptions (95%), their own subjective assessment (90%),
and data documenting student performance (95.8%). Treatment integrity was most
commonly measured using self-reports (86.5%), session logs (88%), and permanent
products of student work (77.6%). Similarities between the collection of formative
(discussed previously) and summative data were found, with anecdotal data (98.6%), data
documenting student performance (97.9%), and the trend of student behavior (97.3%)
being the most popular methods of summative assessment, followed by subjective
assessment (90.7%) and evaluation of the level of the behavior (89.7%).
Comparison of Demographic Variables and Use of the Problem-Solving Model
Respondent data on frequency of use of the major components of the problem-solving
model were also compared with demographic variables (training, years of experience,
school type, other roles and responsibilities) to determine whether these variables
impacted the design and implementation of counseling as a direct intervention. The same
re-coded variables as described previously (with the addition of school type and
psychologist-to-student ratio) were compared against frequency data on use of the major
components of the problem-solving model in 12 separate binary logistic regression
models.
Overall, these demographic variables were not found to offer predictive value for
determining use of the major components of the problem-solving model, as indicated by
nonsignificant overall model statistics and low R² values. Summary tables for these
models can be found in Appendix G. Despite these results, six instances were noted in
which one or more predictive factors made a significant contribution to the model.
Therefore, six chi-square analyses were conducted to explore any possible relationships;
no significant relationships were found between these demographic variables and use of
the problem-solving model (see Appendix F for these chi-square tables).
Comments
At the end of the survey, a section was provided for participants to offer any
additional comments they had regarding the survey or the research topic being explored.
A total of 58 respondents (20.49%) provided comments. Further analysis of the
comments revealed two themes. Some respondents chose to comment in order to clarify
their counseling practices (e.g., they do not counsel, they offer less counseling than in
previous years, or they are employed in a specialized school with a specific population of
students). Furthermore, of those who do not currently offer counseling, many indicated
that they based their responses on previous counseling experiences, or on behavioral and
academic interventions they administer themselves or as members of a school-based
team. Other respondents critiqued the length and focus of the survey (e.g., suggesting a
focus on a narrower set of behaviors, or more options for elaboration and context on the
specific use of techniques).
CHAPTER 5: Discussion
Overview
The emphasis placed on accountability for educational and behavioral outcomes
in schools has increased over time as research continues to document student needs and
achievement. Current best practices in instruction and the design of behavioral and
counseling interventions specify how school psychologists can make data-based
decisions in their counseling practice (Upah, 2008), while federal education legislation
holds them accountable for their work with students (Wright & Wright, 2009). The
problem-solving model integrates research from response to intervention and single-subject design paradigms with a focus on repeated, objective, and observable
measurements of student behavior over time to demonstrate the effectiveness of an
intervention. School psychologists were surveyed regarding their counseling practices,
with a specific focus on their implementation of research and best practice guidelines
related to the use of evidence-based interventions, progress monitoring, and data-based
decision making.
This chapter focuses on the discussion of the results of this study. The discussion
begins with consideration of respondent characteristics and how these demographics
compare with those of other research studies. The chapter continues with a review of the
findings within the major areas (i.e., counseling practices, use of general and specific
components of the problem-solving model) examined in the current study. In addition,
the implications of the results of this study for the field of school psychology, limitations
of the study, and potential directions for future research are discussed.
Respondent Characteristics
Table 40

Comparison of Demographic Characteristics

Variable                                              Curtis et al. (2008)    Current Study
Graduate Degrees Received
  Masters/Specialist                                  71.6%                   77%
  Doctorate                                           24.4%                   22.9%
Psychologist to Student Ratio
  1:<1,000                                            40%                     40.1%
  1:<1,500                                            65%                     69.7%
  1:1,500-2,000                                       17%                     17.2%
  1:>2,000                                            18%                     13%
Type of School Served
  Urban                                               28.4%                   18.1%
  Suburban                                            50.2%                   38.4%
  Rural                                               28.8%                   24.3%
Themes of Professional Development Attended
Over the Past Five Years
  Behavioral Interventions                            47.1%                   79.9%
  Social-Emotional Intervention/Provision of
    Counseling                                        28.7%                   26.3%
  Response to Intervention                            36.7%                   86.6%
Hunley, 2004; Curtis, Hunley, & Grier, 2004) document the graying of the field of
school psychology, and warn of projected shortages due to retiring practitioners
beginning in 2010. The split in years of experience in the current sample could be
explained by an influx of recent graduates entering the field to replace those who have
already retired.
Without over-interpreting the data provided by Curtis et al. (2008), several
commonalities were noted in the topics explored through professional development.
Given the fact that opportunities for professional development face a variety of
constraints (e.g., financial resources, time availability, location), it may be more valuable
to note similarities in topics, rather than agreement in percentages of attendance.
Furthermore, data from the Curtis et al. (2008) study were gathered during the 2004-05
school year, around the same time that the Individuals With Disabilities Education Act
(2004; Wright & Wright, 2009) specified that schools could use a student's response to
intervention to determine eligibility for special education and related services. Current
legislation, such as the No Child Left Behind Act (NCLB, 2002), specifies accountability
for student outcomes and achievement as well as data-based decision-making.
Respondents in the current study continued to pursue professional development on
Response to Intervention (86.6%), along with evidence-based behavioral interventions
(79.9%), data-based decision-making (68.9%), and academic and behavioral
accountability (57.2%).
Consistency in time allocation was also found when respondent data from the
current study were compared to previous research. For example, in the current study,
42.2% of respondents reported spending between 25-50% of their time on assessment,
while allocating 25% of their time or less on direct interventions (80%), consultation and
indirect services (70.9%), research (100%), administration (96.4%), or systems-level
activities (97.1%). Demographic studies conducted over the past few decades mirror
these findings, with respondents indicating that they spend approximately 50% of their
time on assessment, 20-25% engaged in direct intervention, 20-25% on consultation, and
their remaining time involved in systems-level or research activities (Bramlett, Murphy,
Johnson, Wallingsford, & Hall, 2002; Fisher, Jenkins, & Crumbley, 1986; Goldwasser,
Meyers, Christenson, & Graden, 1983; Hartshorne & Johnson, 1985; Lacayo, Sherwood,
& Morris, 1981; Meacham & Peckham, 1978; Reschly & Wilson, 1995; Smith, 1984).
Overall, the statistics describing graduate preparation, experience, and
employment conditions (school setting, psychologist-to-student ratio, time allocation) for
this study are comparable to current and historical demographic data. One difference that
should be noted was years of experience, which should be considered when interpreting
results and generalizing to the larger population. Although few differences were found
between this sample and current demographic research, the small sample size of this
study limits the ability to generalize these results to any population beyond those school
psychologists who responded to the survey.
Counseling Practices of School Psychologists
The opening questions of this study were designed to gather data on the general
counseling practices of school psychologists. Specific areas of interest included
counseling format, students served, discontinuation from counseling, and the use of print
and/or online resources. The majority of respondents (54.8%) see special and general
education students (73.5%) for counseling in group and individual formats. Respondents
with the most experience were more likely than new practitioners to provide individual
counseling exclusively. It is important to note that 24 respondents indicated in the
comments section that they do not currently counsel students. Additionally, of the 65
respondents who declined to participate, 23 did so because they do not counsel within
their school buildings. Due to the narrow focus of this study on the application of the
problem-solving model to counseling as a behavioral intervention, limited comparisons
could be made between this study and the literature on counseling practices. As stated in
Chapter 2, previous research findings indicate that a range of 53-88% of school
psychologists provide group and/or individual counseling (Hanchon & Fernald, 2011;
Yates, 2003), and surveys of time allotment indicate that 20-25% of school
psychologists' time is spent on direct interventions (Bramlett et al., 2002).
To develop an understanding of data-based decision-making, several survey items
explored the discontinuation practices of school psychologists and their use of print or
online resources. Most respondents recommended discontinuing two students from
counseling each year, with the most common reason being that counseling goals had
been met. Experience was found to have an impact on discontinuation practices.
Practitioners with 10 or more years of experience were more likely than expected to
discontinue 7 or more students from counseling, while those with the least amount of
experience were less likely to discontinue students at this frequency. Similar numbers of
respondents reported that they did or did not use print or online resources to assist in
writing goals, clarifying problem behavior, and setting behavioral expectations.
Data regarding discontinuation of counseling services have not been reported in the
available literature; however, due to the small sample size in this study, these results
should be viewed as preliminary.
Use of the Problem-Solving Model
The major focus of this investigation was to determine the frequency with which
school psychologists use components of the problem-solving model. To this end,
respondents were asked to provide data on their use of general and specific components
of the problem-solving model. Provided in Table 41 is a summary of the frequency of
use of these components. Overall, general components of the problem-solving model
used to define and establish the behavior of concern (e.g., behavioral definition [87.5%]
and problem validation [83.8%]), as well as those involved in determining what should
be done about it (goal setting [81.3%] and intervention plan development [85.1%]), were
found to be used most often. In addition, the collection of progress monitoring data
(82.8%) was another practice in which a majority of respondents reported engaging.
Social-emotional behavioral (SEB) research conducted by Merrell (2010) pointed
out that, while a variety of SEB assessment tools are available, their utility lies in
identifying problem behaviors and the factors maintaining those behaviors; they provide
only limited guidance on intervention and on determining whether remediation efforts
have been successful. A majority of respondents in this study have also attended
professional development in recent years focused on evidence-based behavioral
interventions, data-based decision-making, and academic and behavioral accountability,
during which time they may have received strategies and resources for monitoring
student behavior.
Agreement was also found between general and specific components used by
these respondents and legal requirements. For example, as reiterated in IDEA 2004,
Table 41

Use of General and Specific Intervention Components of the Problem-Solving Model

General and Specific PSM Component                               % Usage (n)   % Non-Response (n)
Behavioral definition                                            87.5 (217)    12.4 (35)
  Action verbs describing behavior in observable terms           99.1 (214)    23.7 (67)
  Describe frequency                                             94.9 (203)    24.4 (69)
  Describe intensity                                             86.0 (178)    26.9 (76)
  Describe duration                                              80.2 (166)    26.9 (76)
  Describe topography                                            61.0 (119)    31.1 (88)
  Describe accuracy                                              59.4 (117)    30.4 (86)
  Describe latency                                               52.1 (101)    31.4 (89)
Intervention plan development                                    85.1 (194)    19.4 (55)
  Describe the procedures to be used                             95.1 (176)    35.0 (99)
  Describe activities/steps to be completed in each session      91.8 (167)    35.7 (101)
  Specify location where intervention takes place                86.2 (156)    36.0 (102)
  Describe what each person will do                              83.0 (151)    35.7 (101)
  Specify materials needed for each activity/step                79.6 (145)    35.7 (101)
  Document that intervention is empirically valid                79.4 (146)    35.0 (99)
  Describe how each activity/step will be completed              78.3 (141)    35.0 (99)
Problem validation                                               83.8 (196)    17.3 (49)
Progress monitoring                                              82.8 (173)    26.1 (74)
  Direct behavioral observation                                  98.9 (177)    36.7 (104)
  Third-party behavioral rating scales                           97.2 (174)    36.7 (104)
  Interviews                                                     96.0 (170)    37.5 (106)
  Objective self-report measures                                 88.0 (153)    38.5 (109)
  Sociometric techniques                                         61.3 (103)    40.6 (115)
  Projective-expressive techniques                               35.6 (61)     39.6 (112)
Goal setting                                                     81.3 (191)    17.0 (48)
  Specify behavior                                               98.4 (179)    35.7 (101)
  Specify condition (circumstance in which behavior occurs)      97.8 (176)    36.4 (103)
  Specify criteria (standard for behavioral performance)         97.7 (173)    35.7 (106)
  Specify timeframe when expected progress will be made          92.9 (169)    35.7 (101)
Measurement strategy                                             78.9 (176)    21.2 (60)
  Behavioral definition of the target behavior                   98.3 (173)    37.8 (107)
  Describe when the target behavior will be measured             96.0 (167)    38.5 (109)
  Describe who will measure the target behavior                  95.4 (167)    38.2 (108)
  Describe where the target behavior will be measured            94.9 (167)    37.8 (107)
  Describe appropriate recording measure                         89.5 (154)    39.2 (111)
Treatment integrity                                              77.9 (159)    27.9 (79)
  Logs documenting sessions                                      88.0 (139)    44.2 (125)
  Self-reports                                                   86.5 (135)    44.9 (127)
  Permanent products of student work                             77.6 (121)    44.9 (127)
  Checklists for intervention components                         62.9 (95)     46.6 (132)
  Direct observation by non-involved third party                 53.8 (84)     44.9 (127)
Baseline data                                                    77.7 (188)    14.5 (41)
  Third-party behavior ratings                                   99.0 (189)    32.5 (92)
  Direct behavioral observation                                  98.4 (188)    32.5 (92)
  Third-party interviews                                         90.3 (168)    34.2 (97)
  Objective self-report                                          90.1 (164)    35.7 (101)
  Sociometric techniques                                         65.8 (117)    37.1 (105)
  Projective-expressive techniques                               36.8 (67)     35.7 (101)
Decision-making plan                                             70.0 (156)    21.2 (60)
  Frequency with which data will be collected                    98.6 (146)    47.7 (135)
  How much time will pass before intervention data analysis      95.3 (141)    47.7 (135)
  How many data points will be collected before data analysis    89.8 (132)    48.1 (136)
  How data will be summarized for intervention evaluation        89.2 (131)    48.1 (136)
  Decision rules for responding to specific data points          86.2 (125)    48.8 (138)
Summative evaluation                                             69.3 (142)    27.6 (78)
  Anecdotal information                                          98.6 (144)    48.4 (137)
  Data documenting student performance in school                 97.9 (138)    50.2 (142)
  Trend (change in level from baseline to intervention phases)   97.3 (143)    48.1 (136)
  Practitioners' subjective assessment of student behavior       90.7 (127)    50.5 (143)
  Level (comparison of behavioral occurrence in baseline
    and intervention phases)                                     89.7 (130)    48.8 (138)
Problem analysis                                                 69.2 (162)    17.3 (49)
Formative evaluation                                             55.3 (114)    27.2 (77)
  Data documenting student performance in school                 95.8 (114)    58.0 (164)
  Anecdotal information                                          95.0 (114)    57.6 (163)
  Trend (change in level from baseline to intervention phases)   94.1 (111)    58.3 (165)
  Practitioners' subjective assessment of student behavior       90.0 (108)    58.0 (164)
  Level (comparison of behavioral occurrence in baseline
    and intervention phases)                                     90.0 (108)    57.6 (163)
individualized education plans (IEPs) must include a description of the student's present
levels of academic and functional abilities, including justification for why his or her
disability affects participation and progress in the general curriculum (Wright & Wright,
2009). This tenet of special education law fits with the literature defining problem
validation as the comparison of student behavior with a standard of appropriate behavior
based on the performance of peers, building or district norms, and/or teacher expectations
(Upah, 2008). Because the majority of respondents indicated that they work with general
and special education students, problem validation may be an activity that is
systematically engaged in when working with all students.
Furthermore, special education law requires that IEPs include measurable
academic and functional goals, a description of how those goals will be measured, and
when progress towards those goals will be evaluated (Wright & Wright, 2009). IEPs
must be evaluated "periodically" (Wright & Wright, 2009, p. 104) to determine whether
the student is making progress towards meeting goals and to revise goals when this is
considered necessary. In addition, IEPs in some states also specify the location where
related services, such as counseling, will take place. These clauses may explain the
reported frequency of goal setting, the use of action verbs, and the specific problem-solving model components that respondents may incorporate into service delivery plans.
Before discussing the least often used components of the problem-solving model, it
should be noted that, as the survey continued, the percentage of non-responses per item
increased. Several items that were used infrequently also had higher rates of non-response.
The components used most infrequently were also those that would be used to
monitor and determine the effectiveness of counseling interventions (e.g., formative
Tilman, 2010), and single-subject design (Horner et al., 2005) share some common
themes with the steps of the problem-solving model. The student is the unit of analysis,
and student behavior prior to intervention is compared to behavior during and after the
intervention is completed. One or more observable behaviors are operationally defined
and measured repeatedly, with some assessment of the consistency of measurement. The
target behavior should also have some social significance and relevance to the student.
The intervention must also be operationally defined, with specific and detailed
descriptions of what will take place and where the intervention will occur. Fidelity of
implementation is also important to maintain as the intervention is applied over time.
Student behavior is compared during baseline and intervention phases through regular,
on-going documentation, and then by visual analysis of all phases of behavior (Kazdin,
1982; Upah, 2008). The focus on the objective measurement of observable behaviors
over time, along with demonstrations of behavioral growth and intervention fidelity,
allow for data-based decision making and accountability for student outcomes.
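As a concrete illustration of the level and trend comparisons described above, the following sketch computes both from baseline and intervention phase data. The data values are invented purely for illustration and are not drawn from the study.

```python
# Hypothetical sketch of the "level" and "trend" comparisons used in
# single-subject designs: mean level change and within-phase slope for
# baseline versus intervention data points. All data are invented.
import numpy as np

baseline = np.array([8, 9, 7, 8, 9])         # e.g., disruptions per day, pre-intervention
intervention = np.array([7, 6, 5, 4, 3, 3])  # same behavior during the intervention phase

# Level: compare the mean occurrence of the behavior in each phase
level_change = intervention.mean() - baseline.mean()

# Trend: slope of a least-squares line fit within each phase
def slope(y):
    x = np.arange(len(y))
    return np.polyfit(x, y, 1)[0]

print(f"level change: {level_change:.2f}")  # negative = behavior decreased
print(f"baseline trend: {slope(baseline):.2f} per session")
print(f"intervention trend: {slope(intervention):.2f} per session")
```

A flat or slightly rising baseline trend followed by a clearly falling intervention trend, together with a drop in level, is the kind of pattern visual analysis of graphed phases is meant to reveal.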
While school psychologists are applying many aspects of the problem-solving
model with some consistency, the results of this study suggest that they may not be doing
so in such a way that would allow for easy demonstration of accountability. Although
respondents have indicated that they are behaviorally defining target behaviors, the
infrequent use of formative and summative evaluation could indicate that school
psychologists are not measuring these behaviors well enough to use these data
to determine whether the student has successfully responded to the intervention. The
frequent use of interviews and self-reports during baseline data collection calls into
question whether school psychologists are able to establish an observable, objective
baseline of student behavior before beginning an intervention. Interviews and self-reports
are global, non-standardized methods of data collection that lack empirical validation,
and as such may not provide reliable and valid indications of student behavior.
Furthermore, the high rates of non-response for specific methods of baseline data
collection invite speculation about the adequacy of these data for comparing student
behavior across phases (baseline to intervention). Steps that would allow for the
comparison of baseline and intervention data (e.g., decision-making plans [70%] and
measurement strategy [78.9%]) were not consistently used by respondents in this sample.
Although respondents reported using direct behavioral observation to progress
monitor their interventions, it is unclear whether these observations are being conducted
during sessions or in an environment where the problem behavior occurs. Third-party
behavior rating scales have support in the SEB literature; however, many are not sensitive
enough to be used repeatedly to measure change over the course of an intervention
(Volpe & Gadow, 2010). In addition, less empirically valid methods of progress
monitoring, such as interviews and self-reports, were used almost as often as behavioral
observations. Limited time and resources may prevent practitioners from collecting
sufficient data to make informed decisions about student behavior (Briesch, Chafouleas,
& Riley-Tilman, 2010), which may explain the infrequent use of formative and
summative evaluation.
While data-gathering methods such as anecdotal information, self-reports,
interviews, and practitioners' subjective assessment of student behavior may provide
valuable information, research recommendations call for observable, objective, and
empirical assessment of student behavior gathered over time and presented for visual
analysis (Kazdin, 1982; Upah, 2008). The use of these non-observable and subjective
methods to establish baselines and monitor progress could compromise the value of using
an analysis of level and trend as formative and summative assessment tools. Similarly,
the majority of respondents in this study reported discontinuing students because
counseling goals had been met; however, the low frequency with which students are
discontinued each year might also be related to the adequacy of the data gathering
methods used by this sample.
The high rates of non-response for treatment integrity are another deviation from
recommendations in the literature. Nearly half the sample in this study did not respond to
items involving specific methods for measuring treatment integrity. This suggests that
measuring treatment integrity is not something these practitioners regularly do in
their counseling practice. While useful for planning, session logs, self-reports, and
permanent products of student work may not provide direct evidence that intervention
components have been implemented consistently and accurately over time in the way that
session checklists and direct observations might.
In summary, it would appear that the respondents in this sample are using many
available tools to define and determine target behaviors. Many of their reported
counseling practices appear to conform to legislative guidelines for working with special
education students. These school psychologists indicated that they apply many of the
steps of the problem-solving model in their counseling practice, especially when defining
target behaviors and planning interventions. These results, however, also call into
question the degree to which these practitioners engage in progress monitoring and data-based decision making, as the quality and frequency of baseline and progress monitoring
data collection may not enable documentation and comparison of student behavior to
determine whether behavioral improvement has been made (formative and summative
evaluation).
Implications for School Psychology
Although the conclusions of the current study are tentative, the results suggest that
school psychologists might improve their ability to make data-based decisions and
demonstrate accountability for student outcomes. Specific areas for
growth for current practitioners and training programs include gathering more observable
and objective measurements of behavior over time to clearly demonstrate behavioral
change across baseline and intervention phases, corresponding improvements in
formative and summative assessment, and more consistent demonstration and evaluation
of treatment integrity.
Current research on the roles and responsibilities of school psychologists should
be considered in the discussion on possible barriers to improvements in data gathering.
As mentioned in Chapter 2, the four roles of the school psychologist include assessment,
direct intervention, consultation, and systems-level intervention (Fagan, 2008; NASP,
2010). School psychologists have expressed a preference for spending less time on
assessment, and more time and resources on interventions and consultation (Hosp &
Reschly, 2002; Reschly & Wilson, 1995), with a specific focus on student mental health
needs (Agresta, 2004). Briesch, Chafouleas, and Riley-Tilman (2010) cited limited time
and resources as factors that prevent practitioners from collecting enough data to make
informed decisions about student behavior. Given the connection between emotional
well-being and successful learning experiences (Haertel, Walberg, & Weinstein, 1983;
Wang, Haertel, & Walberg, 1990) and the focus on accountability for student outcomes
(Wright & Wright, 2009), researchers, practitioners, and professional organizations
should continue to advocate for role expansion and re-allocation of the responsibilities of
school psychologists. A paradigm shift, whereby school psychologists translate
behavioral research into practice and become active problem-solvers within their school
buildings, is necessary to demonstrate accountability for student outcomes and
achievement.
To increase accountability, school psychologists may need to re-allocate the
amount of time they spend engaged in research, consultation, and systems level activities,
in addition to spending more time on direct interventions. Behavioral research and best
practice guidelines exist specifying how to most effectively gather and evaluate objective,
observable, and repeated measures of student behavior (Briesch, Chafouleas, & Riley-Tilman, 2010; Upah, 2008; Volpe & Gadow, 2010). Current and historical time
allocation data suggest that consuming and applying this information is not possible
unless school psychologists are able to prioritize research as a professional function. At
the systems level, school psychologists may need to advocate for gathering more precise
behavioral data at a higher frequency than what is currently done in many school
buildings. Although questions remain regarding who collects behavioral data and in what
environment, it would appear as though consensus needs to be reached on how data will
be collected, by whom, and at what frequency. This may involve school psychologists
collecting data themselves, but data collection may also fall to other school professionals
who are better positioned to see students display problematic behaviors. If other
school professionals collect data, then school psychologists may need to have more time
available for consultation in order to provide their colleagues with necessary tools and
training, to measure fidelity to established protocols, and to increase buy-in, if this
becomes a factor.
Issues related to data gathering, decision-making, and accountability should be
addressed by school psychology training programs, as well as organizations providing
continuing education and professional development. These issues highlight a necessary
paradigm shift where research is integrated into practice, with new and experienced
practitioners receiving training to become active problem-solvers. The results of this
study would suggest that the focus should be on developing and disseminating knowledge
and supervised experience gathering repeated, objective measurements of student
behavior to establish baseline, document progress during intervention, and to design and
evaluate formative and summative assessment. Building in measurement of treatment
integrity should also be an essential component of training and continuing education.
Experiences with these skills should be connected to coursework, practice, and
professional development with direct and indirect interventions, to ensure that
practitioners know how to implement standards of data-based decision making and
accountability for outcomes within the settings in which they are employed.
Limitations and Directions for Future Research
The results of this survey provide a broad overview of the implementation of
counseling interventions based on the problem-solving model, a topic that had not
previously been addressed in the literature. In evaluating the results, it is necessary to
address the study's limitations, particularly with respect to the reliability and validity of
the survey in accurately capturing information on
2005). The length of this survey was a concern from the beginning, and therefore,
question selection was limited to those items considered most likely to answer the
research questions. As such, items pertaining to specific components of problem analysis
and validation were omitted. When analyzing the results of this study, however, it would
be beneficial to have information regarding the data used to establish and learn about
behaviors of concern, how these data are gathered, and to what these data are compared.
Future research exploring data-based decision making and accountability should focus on
fewer components of the problem-solving model (e.g., baseline and progress monitoring
data collection, and/or formative and summative assessment) in greater depth, now that
preliminary research has examined these topics with a broad lens. Shorter measures may
also increase the response rate and provide a more valid picture of counseling practices
implemented in schools today.
Additional limitations for this survey match several of those listed in Chapter 2
for self-administered surveys. For example, once respondents begin completing the
survey, it cannot be altered, and some respondents may struggle to align their own
experiences with those choices presented to them on a survey (Barribeau et al., 2005). In
the case of this survey, after implementation, it was discovered that the treatment
integrity item allowed respondents to select whether they sometimes or always used
this component. All other items pertaining to the use of general components of the
problem-solving model forced respondents to state whether they did or did not employ
each component. When comparing the usage of general and specific components of the
problem-solving model with the rates of non-response, it would appear as though, in
some cases, respondents indicated that they engaged in a general component without
being able to describe their practice by selecting any of the corresponding specific
components. Large increases in the rate of non-response between general and specific
components for steps such as treatment integrity, and summative and formative
evaluation would suggest that, on some items, there might have been a discrepancy
between the intent of the researcher and the understanding of the respondent.
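The comparison described above, checking whether non-response jumps between a general item and its corresponding specific items, can be made concrete with a small sketch. All item names, response counts, and the 20% threshold below are hypothetical illustrations, not values from this study:

```python
def nonresponse_rate(responses):
    """Fraction of respondents who skipped the item (None = no answer)."""
    return sum(1 for r in responses if r is None) / len(responses)

def flag_discrepancies(items, threshold=0.20):
    """Return step names where the specific item's non-response rate
    exceeds the general item's rate by more than `threshold`."""
    flagged = []
    for step, (general, specific) in items.items():
        gap = nonresponse_rate(specific) - nonresponse_rate(general)
        if gap > threshold:
            flagged.append(step)
    return flagged

# Invented data: True/False = answered, None = skipped the item.
items = {
    "treatment_integrity": ([True] * 9 + [None],       # general: 10% skipped
                            [True] * 5 + [None] * 5),  # specific: 50% skipped
    "problem_identification": ([True] * 9 + [None],
                               [True] * 8 + [None] * 2),
}
print(flag_discrepancies(items))  # -> ['treatment_integrity']
```

A large flagged gap would be consistent with the interpretation above: respondents endorsed the general component but could not map their practice onto any of the specific choices offered.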
To address these limitations, future surveys should include uniform response
choices, and data on counseling might be obtained using a mixed method research design.
Focus groups composed of school psychologists who spend varying amounts of time on
assessment and direct interventions could provide insight on their counseling practices to
determine the level of agreement between what is being done and what is specified in
research and best practices. An important question concerns the barriers and
facilitators practitioners face, especially in gathering empirical and objective
behavioral data. Researchers could also examine de-identified examples of
documentation related to counseling practices, such as intervention, decision-making, and
measurement plans, evidence of treatment integrity, and aggregated student data used for
visual analysis as part of formative and summative evaluation. Information gathered
from these data would allow school psychologists as a field to evaluate their level of
accountability, while providing guidance for researchers, professional organizations,
practitioners, and training programs.
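As one illustration of how aggregated student data might be summarized to support the visual analysis mentioned above, the following sketch fits an ordinary least-squares trend line to progress-monitoring scores. The sessions and scores are invented for illustration and are not drawn from this study:

```python
def trend_slope(sessions, scores):
    """Least-squares slope of scores over sessions (change per session)."""
    n = len(sessions)
    mean_x = sum(sessions) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(sessions, scores))
    den = sum((x - mean_x) ** 2 for x in sessions)
    return num / den

# Hypothetical counts of on-task intervals across six counseling sessions.
sessions = [1, 2, 3, 4, 5, 6]
scores = [4, 5, 7, 8, 10, 11]
print(round(trend_slope(sessions, scores), 2))  # -> 1.46
```

A positive slope of this kind, read alongside the plotted data, is the sort of formative evidence a practitioner could document when deciding whether to continue, modify, or end an intervention.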
Analysis of the results of this study also produced additional questions related to
where baseline and progress monitoring data gathering occurs. Respondents reported
using direct behavioral observation to gather baseline and progress monitoring data.
These items, however, did not specify where such behavioral observations occur.
References
Agresta, J. (2004). Professional role perceptions of school social workers, psychologists,
and counselors. Children and Schools, 26, 151-163.
Ahrens, J., & Rexford, L. (2002). Cognitive processing therapy for incarcerated
adolescents with PTSD. Journal of Aggression, Maltreatment & Trauma, 6, 201-216.
doi:10.1300/J146v06n01_10
Aldred, C., Green, J., & Adams, C. (2004). A new social communication intervention for
children with autism: Pilot randomized controlled treatment study suggesting
effectiveness. Journal of Child Psychology and Psychiatry, 45, 1420-1430.
doi:10.1111/j.1469-7610.2004.00338.x
Alexander, L. (1986). Summary of Time for Results, report on education by governors'
group. Chronicle of Higher Education, 33(1), 78-79.
Alreck, P.L., & Settle, R.B. (2004). The survey research handbook (3rd ed.). New York:
McGraw-Hill/Irwin.
American Counseling Association. (2011). Resources. Retrieved from
http://www.counseling.org/Resources/
American Psychiatric Association. (2000). Diagnostic and statistical manual of mental
disorders (4th ed.). Washington, DC: Author.
American Psychological Association (APA). (2005). Policy statement on evidence-based
practice in psychology. Retrieved from http://www.mspp.net/APA%20policy%20
On%20EBT.htm
American Psychological Association (APA). (2010). Ethical principles of psychologists
and code of conduct. Retrieved from http://www.apa.org/ethics/code/index.aspx#
Baer, D. M., Wolf, M. M., & Risley, T. R. (1987). Some still-current dimensions of
applied behavior analysis. Journal of Applied Behavior Analysis, 20, 313-327.
doi:10.1901/jaba.1987.20-313
Bandura, A. (1977). Social learning theory. Englewood Cliffs, NJ: Prentice-Hall.
Bandura, A. (2004). Models of causality in social learning theory. In A. Freeman,
M.J. Mahoney, P. Devito, & D. Martin (Eds.) Cognition and psychotherapy (2nd
ed., pp. 25-44). New York: Springer.
Barkley, R.A., Shelton, T.L., Crosswait, C., Moorehouse, M., Fletcher, K., Barrett, S.
(2000). Multimethod psychoeducational intervention for preschool children
with disruptive behavior: Preliminary results at post-treatment. Journal of Child
Psychology and Psychiatry and Allied Disciplines, 41, 319-332. doi:10.1111/
1469-7610.00616
Barrett, P. M. (1998). Evaluation of cognitive-behavioral group treatments for childhood
anxiety disorders. Journal of Clinical Child Psychology, 27, 459-468.
doi:10.1207/s15374424jccp2704_10
Barrett, P. M., Dadds, M. R., & Rapee, R. M. (1996). Family treatment of childhood
anxiety: A controlled trial. Journal of Consulting and Clinical Psychology, 64,
333-342. doi:10.1037/0022-006X.64.2.333
Barrett, P. M., Farrell, L., Pina, A. A., Peris, T. S., & Piacentini, J. (2008). Evidence-based
psychosocial treatments for child and adolescent obsessive-compulsive
disorder. Journal of Clinical Child & Adolescent Psychology, 37, 131-155.
doi: 10.1080/15374410701817956
Barrett, P.M., Healy-Farrell, L.J., & March, J.S. (2004). Cognitive-behavioral family
Bernal, M.E., Klinnert, M.D., & Schultz, L.A. (1980). Outcome evaluation of behavioral
parent training and client-centered parent counseling for children with conduct
problems. Journal of Applied Behavior Analysis, 13, 677-691. doi:10.1901/jaba.
1980.13-677
Beyer, H. (1989). Education for All Handicapped Children Act: 1975-1989. A judicial
history. Exceptional Parent, 19(6), 52-58.
Block, J. (1978). Effects of a rational-emotive mental health program on poorly
achieving, disruptive high school students. Journal of Counseling Psychology,
25, 61-65. doi:10.1037/0022-0167.25.1.61
Bogels, S.M., & Siqueland, L. (2006). Family cognitive behavioral therapy for children
and adolescents with clinical anxiety disorders. Journal of the American Academy
of Child and Adolescent Psychiatry, 45, 134-141. doi:10.1097/01.chi.
0000190467.01072.ee
Bor, W., Sanders, M.R., & Markie-Dadds, C. (2002). The effects of the Triple-P
Positive Parenting Program on preschool children with co-occurring disruptive
behavior and attentional/hyperactive difficulties. Journal of Abnormal Child
Psychology, 30, 571-587. doi:10.1023/A:1020807613155
Borduin, C., Mann, B.J., Cone, L.T., Henggeler, S.W., Fucci, B.R., Blaske, D.B.,
(1995). Multisystemic treatment of serious juvenile offenders: Long-term
prevention of criminality and violence. Journal of Consulting and Clinical
Psychology, 63, 569-578. doi:10.1037/0022-006X.63.4.569
Bramlett, R. K., Murphy, J. J., Johnson, J., Wallingsford, L., & Hall, J. D. (2002).
Contemporary practices in school psychology: A national survey of roles and
family correlates, and gender. Journal of Family Issues, 27, 587-608. doi:10.1177/
0192513X05285187
Chemtob, C., Nakashima, J., & Carlson, J. (2002). Brief treatment for elementary school
children with disaster-related posttraumatic stress disorder: A field study. Journal
of Clinical Psychology, 58, 99-112. doi:10.1177/0192513X05285187
Christensen, A., Johnson, S.M., Phillips, S., & Glasgow, R.E. (1980). Cost effectiveness
in behavioral family therapy. Behavior Therapy, 11, 208-226. doi:10.1016/
S0005-7894(80)80021-9
Christian, L.M., & Dillman, D.A. (2004). The influence of symbolic and graphical
language manipulations on answers to paper self-administered questionnaires.
Public Opinion Quarterly, 68, 57-80. doi:10.1093/poq/nfh004
Clarke, G.N., Hawkins, W., Murphy, M., Sheeber, L., Lewinsohn, P.M., & Seeley, J.
(1995). Targeted prevention of unipolar depressive disorder in an at risk sample
of high school adolescents: A randomized trial of a group cognitive intervention.
American Academy of Child and Adolescent Psychiatry, 34, 312-321. doi:10.1097
/00004583-199503000-00016
Clarke, G.N., Hornbrook, M., Lynch, F., Polen, M., Gale, J., Beardslee, W.R.
(2001). A randomized trial of a group cognitive intervention for preventing
depression in adolescent offspring of depressed parents. Archives of General
Psychiatry, 58, 1127-1134. doi:10.1001/archpsyc.58.12.1127
Clarke, G.N., Rohde, P., Lewinsohn, P.M., Hops, H., & Seeley, J. (1999). Cognitive
behavioral treatment of adolescent depression: Efficacy of acute group treatment
and booster session. Journal of the American Academy of Child and Adolescent
Cornwall, E., Spence, S.H., & Schotte, D. (1996). The effectiveness of emotive imagery
in the treatment of darkness phobia in children. Behaviour Change, 13, 223-229.
Counseling. (n.d.). In Merriam-Webster's online dictionary. Retrieved from
http://mw4.merriam-webster.com/dictionary/counseling?show=0&t=1296248084
Couper, M.P., Kapteyn, A., Schonlau, M., & Winter, J. (2007). Noncoverage and
nonresponse in an internet survey. Social Science Research, 36, 131-148. doi:10.
1016/j.ssresearch.2005.10.002
Crespi, T. D. (2009). Group counseling in the schools: Legal, ethical, and treatment
issues in school practice. Psychology in the Schools, 46, 273-280. doi:10.1002
/pits.20373
Crespi, T.D., & Fischetti, B.A. (1997, September). Counseling and psychotherapy in the
schools: Rationale and considerations for professional practice. NASP
Communiqué, 26, 18-20.
Crespi, T.D., & Howe, E.A. (2002). Families in crisis: Considerations for special service
providers in the school. Special Services in the Schools, 18, 43-54. doi:10.1300
/J008v18n01_03
Crespi, T.D., & Rigazio-DiGilio, S.A. (1996). Adolescent homicide and family
pathology: Implications for research and treatment with adolescents. Adolescence,
31, 353-367.
Crespino, J. (2006). The best defense is a good offense: The Stennis Amendment and the
fracturing of liberal school desegregation policy, 1964-1972. Journal of Policy
History, 18, 304-325. doi:10.1353/jph.2006.0008
Curtis, M. J., Grier, E. J. C., & Hunley, S. A. (2004). The changing face of school
psychology: Trends in data and projections for the future. School Psychology
Review, 28, 104-116.
Curtis, M. J., Hunley, S. A., & Grier, E. C. (2004). The status of school psychology:
Implications of a major personnel shortage. Psychology in the Schools, 41,
431-442. doi:10.1002/pits.10186
Curtis, M.J., Lopez, A.D., Castillo, J.M., Batsche, G.M., Minch, D., & Smith, J.C.
(2008). The status of school psychology: Demographic characteristics,
employment conditions, professional practices, and continuing professional
development. Communiqué, 36, 27-29.
Curtis, M. J., Lopez, A. D., Batsche, G. M., & Smith, J. C. (2006, March). School
psychology 2005: A national perspective. Paper presented at the annual meeting
of the National Association of School Psychologists, Anaheim, CA.
Cutts, N. E. (1955). School psychologists at mid-century. Washington, DC: American
Psychological Association.
David-Ferdon, C., & Kaslow, N. (2008). Evidence-based psychosocial treatments for
child and adolescent depression. Journal of Clinical Child & Adolescent
Psychology, 37, 62-104. doi: 10.1080/15374410701817865
Deblinger, E., Lippman, J., & Steer, R. (1996). Sexually abused children suffering
posttraumatic stress symptoms: Initial treatment outcome findings. Child
Maltreatment, 1, 310-321.
Deblinger, E., Stauffer, L., & Steer, R. (2001). Comparative efficacies of supportive and
cognitive behavioral group therapies for young children who have been sexually
abused and their nonoffending mothers. Child Maltreatment, 6, 332-343.
Dennis, M., Godley, S.H., Diamond, G., Tims, F.M., Babor, T., Donaldson, J. (2004).
The Cannabis Youth Treatment (CYT) Study: Main findings from two
randomized trials. Journal of Substance Abuse Treatment, 27, 197-213.
Deno, S. L. (1995). School psychologist as problem solver. In A. Thomas & J. Grimes
(Eds.), Best practices in school psychology (3rd ed., pp. 471-484). Washington,
DC: National Association of School Psychologists.
Dillman, D.A. (2007). Recent developments in the design of web, mail, and mixed-mode
surveys. In Mail and internet surveys: The Tailored Design method (pp. 447-503).
Hoboken, NJ: John Wiley & Sons.
Dillman, D.A., & Redline, C.D. (2004). Testing paper self-administered questionnaires:
Cognitive interview and field test comparisons. In S. Presser et al. (Eds.),
Methods for testing and evaluating survey questionnaires (pp. 299-317). New
York: Wiley-Interscience.
Doll, B. (1996). Prevalence of psychiatric disorders in children and youth: An agenda for
advocacy by school psychology. School Psychology Quarterly, 11, 20-46.
Doll, B., & Cummings, J. A. (2008). Best practices in population-based school mental
health services. In A. Thomas & J. Grimes (Eds.), Best Practices in School
Psychology (5th ed., pp. 1333-1347). Bethesda, MD: National Association of
School Psychologists.
Drew, A., Baird, G., Baron-Cohen, S., Cox, A., Slomins, V., Wheelwright, S. (2002).
A pilot randomized control trial of a parent training intervention for pre-school
children with autism: Preliminary findings and methodological challenges.
European Child and Adolescent Psychiatry, 11, 266-272.
Eikeseth, S., Smith, T., Jahr, E., Eldevik, S. (2002). Intensive behavioral treatment at
school for 4- to 7-year-old children with autism: A 1-year comparison controlled
study. Behavioral Modification, 26, 49-68.
Elementary and Secondary Education Act of 1965, Pub. L. No. 89-10 Stat. 79 (1965).
Retrieved from http://nysl.nysed.gov/Archimages/91338.PDF
Elementary and Secondary Education Amendments of 1967, Pub. L No. 90-247, Stat. 81
(1968). Retrieved from http://nysl.nysed.gov/Archimages/91341.PDF
Elementary and Secondary Education Act Amendments of 1974, Pub. L. No. 93-380 Stat.
88. Retrieved from http://nysl.nysed.gov/Archimages/91344.PDF
Equal Education Opportunity Act of 1974, 20 U.S.C. 1701, 1702. Retrieved from
http://codes.lp.findlaw.com/uscode/20/39/I/1/1702
Evans, J.R., & Mathur, A. (2005). The value of online surveys. Internet Research, 13,
195-219.
Eyberg, S. M., Nelson, M. M., & Boggs, S. R. (2008). Evidence-based psychosocial
treatments for children and adolescents with disruptive behavior. Journal of
Clinical Child & Adolescent Psychology, 37, 215-237. doi: 10.1080/
15374410701820117
Fagan, T. K. (1986). The historical origins and growth of programs to prepare school
psychologists in the United States. Journal of School Psychology, 24, 9-22.
Fagan, T. K. (1988). The historical improvement of the school psychology service ratio:
Implications for future employment. School Psychology Review, 17, 447-458.
Fagan, T. K. (1992). Compulsory schooling, child study, clinical psychology, and special
education: Origins of school psychology. American Psychologist, 47(2), 236-243.
Fagan, T. K. (2008). Trends in the history of school psychology in the United States. In
A. Thomas & J. Grimes (Eds.), Best Practices in School Psychology (5th ed., pp.
2069-2085). Bethesda, MD: National Association of School Psychologists.
Fantuzzo, J., Manz, P., Atkins, M., & Meyers, R. (2005). Peer-mediated treatment of
socially withdrawn maltreated preschool children: Cultivating natural community
resources. Journal of Clinical Child & Adolescent Psychology, 34, 320-325.
Fantuzzo, J., Sutton-Smith, B., Atkins, M., Meyers, R., Stevenson, H., Coolahan, K.
(1996). Community-based resilient peer treatment of withdrawn maltreated
preschool children. Journal of Consulting and Clinical Psychology, 64, 1377-1386.
Farmer, E.M., Burns, B.J., Philip, S.D., Angold, A., & Costello, E.J. (2003). Pathways
into and through mental health services for children and adolescents. Psychiatric
Services, 54, 60-67. doi:10.1176/appi.ps.54.1.60
Farrell, A.D., Guerra, N.G., & Tolan, P.H. (1996). Preventing aggression in inner city
children: Small group training to change cognitions, social skills, and behavior.
Journal of Child and Adolescent Group Therapy, 4, 229-242.
Feindler, E.L., Marriot, S.A., & Iwata, M. (1984). Group anger control training for junior
high school delinquents. Cognitive Therapy and Research, 8, 299-311. doi:10.
1007/BF01173000
Fergusson, D.M., Horwood, L.L., & Lynskey, M. (1994). The childhoods of multiple-problem
adolescents: A 15-year longitudinal study. Journal of Child Psychology
and Psychiatry, 35, 1123-1140. doi:10.1111/j.1469-7610.1994.tb01813.x
Fisher, G. L., Jenkins, S. J., & Crumbley, J. D. (1986). A replication of a survey of school
http://www2.ed.gov/legislation/GOALS2000/TheAct/intro.html
Goldstein, T.R., Axelson, D.A., Birmaher, B., & Brent, D.A. (2007). Dialectical behavior
therapy for adolescents with bipolar disorder: A one-year open trial. Journal of
the American Academy of Child and Adolescent Psychiatry, 46, 820-830. doi:
10.1097/chi.0b013e31805c1613
Goldwasser, E., Meyers, J., Christenson, S., & Graden, J. (1983). The impact of
PL 94-142 on the practice of school psychology: A national survey. Psychology in
the Schools, 20, 153-165. doi:10.1002/1520-6807(198304)20:2<153::AID-PITS2310200206>3.0.CO;2-W
Granello, D.H., & Wheaton, J.E. (2004). Online data collection: Strategies for research.
Journal of Counseling & Development, 82, 387-393.
Grant, W. V., & Eiden, L. J. (1980). Digest of educational statistics 1980. Washington,
DC: U. S. Government Printing Office.
Gray, P. J., Caulley, D. N., & Smith, N. L. (1982). A study in contrasts: Effects of the
Education Consolidation and Improvement Act of 1981 on SEA and LEA
evaluation (ERIC No ED 235 183). Northwest Regional Educational Lab.
Retrieved from http://www.eric.ed.gov/PDFS/ED235183.pdf
Gresham, F.M. (1989). Assessment of treatment integrity in school consultation and
prereferral intervention. School Psychology Review, 17, 211-226.
Gresham, F.M. (2002). Responsiveness to intervention: An alternative approach to the
identification of learning disabilities. In R. Bradley, L. Danielson, & D. Hallahan
(Eds.), Learning disabilities: Research to practice (pp. 467-519). Mahwah, NJ:
Lawrence Erlbaum.
Hartshorne, T. S., & Johnson, M. C. (1985). The actual and preferred roles of the school
psychologist according to secondary school administrators. Journal of School
Psychology, 23, 241-246. doi:10.1016/0022-4405(85)90015-9
Hawkins, R. P., & Dobes, R. W. (1977). Behavioral definitions in applied behavior
analysis: Explicit or implicit. In B. C. Etzel, J. M. LeBlanc, & D. M. Baer (Eds.),
New developments in behavioral research: Theory, methods, and applications.
Hillsdale, NJ: Erlbaum.
Hayes, S.C., Barlow, D.H., & Nelson-Gray, R. O. (1999). The scientist-practitioner:
Research and accountability in the age of managed care. Boston: Allyn & Bacon.
Hayward, C., Varady, S., Albano, A.M., Thienamann, M., Henderson, L., & Schatzberg,
A.F. (2000). Cognitive-behavioral group therapy for social phobia in female
adolescents: Results of a pilot study. Journal of the American Academy of Child
and Adolescent Psychiatry, 39, 721-726. doi:10.1097/00004583-200006000-00010
Heartland Area Education Agency. (2007). Improving children's educational results
through data-based decision-making. Retrieved from http://www.aea11.k12.ia.us
/spedresources/ModuleFour.pdf
Henggeler, S.W., Clingempeel, W.G., Brondino, M.J., & Pickrel, S.G. (2002). Four-year
follow-up of multisystemic therapy with substance-abusing and substance-dependent
juvenile offenders. Journal of the American Academy of Child and
Adolescent Psychiatry, 41, 868-874. doi:10.1097/00004583-200207000-00021
Henggeler, S.W., Melton, G.B., Brondino, M.J., Scherer, D.G., & Hanley, J.H. (1997).
Multisystemic therapy with violent and chronic juvenile offenders and their
Hoagwood, K., & Erwin, H. (1997). Effectiveness of school-based mental health services
for children: A 10 year research review. Journal of Child and Family Studies, 6,
435-451. doi:10.1023/A:1025045412689
Hoagwood, K., Hibbs, E., Brent, D., & Jensen, P. (1995). Introduction to the special
section: Efficacy and effectiveness in studies of child and adolescent
psychotherapy. Journal of Consulting and Clinical Psychology, 63, 683-687.
doi:10.1037//0022-006X.63.5.683
Hoagwood, K., & Johnson, J. (2003). School psychology: A public health framework: I.
From evidence-based practices to evidence-based policies. Journal of School
Psychology, 41, 3-21. doi:10.1016/S0022-4405(02)00141-3
Hoath, F.E., & Sanders, M.R. (2002). A feasibility study of enhanced group Triple P-Positive
Parenting Program for parents of children with attention-deficit/
hyperactivity disorder. Behavior Change, 19, 191-206. doi:10.1375/bech.19.4.191
Hollingworth, L. S. (1922, May). Existing laws which authorize psychologists to perform
professional services. Journal of Criminal Law and Criminology, 12, 70-73.
Hollingworth, L. S. (1932, February). Psychological service for public schools. Address
delivered to the Child Study Club of Springfield, MA.
Holmbeck, G. (1997). Toward terminological, conceptual, and statistical clarity in the
study of mediators and moderators: Examples from the child-clinical and pediatric
psychology literatures. Journal of Consulting and Clinical Psychology, 65, 599-610.
Hops, H., Waldron, H.B., Davis, B., Barrera, M., Turner, C.W., Brody, J., &
Ozechowski, T.J. (2007). Ethnic influences on family processes and family
Lewinsohn, P.M., Clarke, G., Hops, H., & Andrews, J. (1990). Cognitive-behavioral
treatment for depressed adolescents. Behavior Therapy, 21, 385-401. doi:10.
1016/S0005-7894(05)80353-3
Lewinsohn, P.M., Clarke, G., Rohde, P., Hops, H., & Seeley, J. (1996). A course in
coping: A cognitive-behavioral approach to the treatment of adolescent
depression. In E.D. Hibbs & P.S. Jensen (Eds.), Psychosocial treatments for child
and adolescent disorders: Empirically based strategies for clinical practice (pp.
109-135). Washington, DC: American Psychological Association.
Lichtenstein, R. (2008). Best practices in identification of learning disabilities. In A.
Thomas & J. Grimes (Eds.), Best practices in school psychology (5th ed., pp.
1661-1672). Bethesda, MD: National Association of School Psychologists.
Liddle, H.A., Dakof, G.A., Diamond, G.S., Parker, G.S., Barrett, K., & Tejeda, M.
(2001). Multidimensional family therapy for adolescent substance abuse: Results
of a randomized clinical trial. American Journal of Drug and Alcohol Abuse, 27,
651-687. doi:10.1081/ADA-100107661
Liddle, H.A., Rowe, C.L., Dakof, G.A., Ungaro, R.A., & Henderson, C.E. (2004). Early
intervention for adolescent substance abuse: Pretreatment to posttreatment
outcomes of a randomized clinical trial comparing multidimensional family
therapy and peer group treatment. Journal of Psychoactive Drugs, 36, 49-63.
Lieberman, A.F., Van Horn, P., & Ippen, C.G. (2005). Toward evidence-based treatment:
Child-parent psychotherapy with preschoolers exposed to marital violence.
Journal of the American Academy of Child & Adolescent Psychiatry, 44,
1241-1248. doi:10.1097/01.chi.0000181047.59702.58
Lochman, J.E., Coie, J.D., Underwood, M.K., & Terry, R. (1993). Effectiveness of a
social relations intervention program for aggressive and nonaggressive, rejected
children. Journal of Consulting and Clinical Psychology, 61, 1053-1058. doi:10.
1037/0022-006X.61.6.1053
Lonigan, C. J., Elbert, J. C., & Johnson, S. B. (1998). Empirically supported psychosocial
interventions for children: An overview. Journal of Clinical Child Psychology,
27, 138-145. doi:10.1207/s15374424jccp2702_1
Lovaas, O.I. (1987). Behavioral treatment and normal educational and intellectual
functioning in young autistic children. Journal of Consulting and Clinical
Psychology, 55, 3-9. doi:10.1037/0022-006X.55.1.3
Mangione, T. W. (1995). Mail surveys: Improving the quality. Applied Social Research
Methods Services, Vol. 40. Thousand Oaks, CA: Sage Publications.
Meacham, M., & Peckham, P. (1978). School psychologists at three-quarters century:
Congruence between training, practice, preferred role and competence. Journal of
School Psychology, 16, 195-206. doi:10.1016/0022-4405(78)90001-8
Mendlowitz, S. L., Manassis, K., Bradley, S., Scapillato, D., Miezitis, S., & Shaw, B. F.
(1999). Cognitive-behavioral group treatments in childhood anxiety disorders:
The role of parental involvement. Journal of the American Academy of Child and
Adolescent Psychiatry, 38, 1223-1229. doi:10.1097/00004583-199910000-00010
Melvin, G.A., Tonge, B.J., King, N.J., Heyne, D., Gordon, M.S., & Klimkeit, E. (2006).
A comparison of cognitive-behavioral therapy, sertraline, and their combination
for adolescent depression. Journal of the American Academy of Child and
Adolescent Psychiatry, 45, 1151-1161. doi:10.1097/01.chi.0000233157.21925.71
Merrell, K.W. (2008). Behavioral, social, and emotional assessment of children and
adolescents (3rd ed.). New York: Erlbaum/Taylor & Francis.
Merrell, K.W. (2010). Better methods, better solutions: Developments in school-based
behavioral assessment. School Psychology Review, 39(3), 422-426.
Merrell, K. W., Ervin, R. A., & Gimpel, G. A. (2006). Legal and ethical issues in school
psychology. In School psychology for the 21st century (pp. 113-138). New York:
Guilford.
Merrell, K. W., Ervin, R. A., & Gimpel, G. A. (2006). Working as a school psychologist.
In School psychology for the 21st century (pp. 94-112). New York: Guildford.
Miklowitz, D.J., Axelson, D.A., Birmaher, B., George, E.L., Taylor, D.O., Schneck,
C.D., Brent, D.A. (2008). Family-focused treatment for adolescents with
bipolar disorder: Results of a two-year randomized trial. Archives of General
Psychiatry, 65, 1053-1061. doi:10.1001/archpsyc.65.9.1053
Mills, C., Stephan, S.H., Moore, E., Weist, M.D., Daly, B.P., & Edwards, M. (2006).
The president's New Freedom Commission: Capitalizing on opportunities to
advance school-based mental health services. Clinical Child and Family
Psychology Review, 9, 146-161. doi:10.1007/s10567-006-0003-3
Mills v. Board of Education of District of Columbia, 348 F. Supp. 866 (1972); Contempt
proceedings, 551 Educ. of the Handicapped L. Rep. 643 (D. D.C. 1980).
Miltenberger, R.G. (2004). Behavior modification: Principles and procedures (3rd ed.).
Pacific Grove, CA: Wadsworth.
Miltenberger, R.G. (2005). Strategies for measuring behavioral change. In L.M.
Bambara & L. Kern (Eds.), Individualized supports for students with problem
Muris, P., Merkelbach, H., Holdrinet, I., & Sijsenaar, M. (1998). Treating phobic
children: Effects of EMDR versus exposure. Journal of Consulting and Clinical
Psychology, 66, 193-198. doi:10.1037/0022-006X.66.1.193
Murphy, J. J. (2008). Best practices in conducting brief counseling with students. In A.
Thomas & J. Grimes (Eds.), Best Practices in School Psychology (5th ed., pp.
1439-1455). Bethesda, MD: National Association of School Psychologists.
National Association of School Psychologists (NASP). (2000). NASP standards for
training and field placement programs in school psychology. Retrieved from
http://www.nasponline.org/standards/FinalStandards.pdf
National Association of School Psychologists (NASP). (2006). Social/emotional
development: School-based mental health services and school psychologists.
Retrieved from http://www.nasponline.org/about_nasp/pospaper_iac.aspx
National Association of School Psychologists (NASP). (2010). Model for comprehensive
and integrated school psychological services. Retrieved from
www.nasponline.org/standards/2010standards/2_PracticeModel.pdf
National Association of Social Workers. (2008). Code of ethics of the National
Association of Social Workers. Retrieved from http://www.naswdc.org/pubs/
code/code.asp
National Center for Children and Youths with Disabilities (1998). The IDEA
amendments of 1997. New Digest, 26, 1-40.
National Center on Response to Intervention. (2011). 2011 Screening Call for
Submissions of RTI Screening Tools. Retrieved from http://www.rti4success.org
/resourceslanding
Newton, J.S., Horner, R.H., Algozzine, R.F., Todd, A.W., & Algozzine, K.M. (2009).
Using a problem-solving model to enhance data-based decision making in
schools. In W. Sailor, G. Dunlap, G. Sugai, & R.H. Horner (Eds.), Handbook of
positive behavior support (pp. 551-580). New York: Springer.
Nielsen Company. (2008, December). An overview of home internet access in the U.S.
Retrieved from http://blog.nielsen.com/nielsenwire/wp-content/uploads
/2009/03/overview-of-home-internet-access-in-the-us-jan-6.pdf
Nixon, R.D., Sweeney, L., Erickson, D.B., & Touyz, S.W. (2003). Parent-child
interaction therapy: A comparison of standard and abbreviated treatments for
oppositional defiant preschoolers. Journal of Consulting and Clinical
Psychology, 71, 251-260. doi:10.1037/0022-006X.71.2.251
No Child Left Behind Act of 2001, Pub. L. No. 107-110, 115 Stat. 1425 (2002). Accessed
from http://www2.ed.gov/policy/elsec/leg/esea02/107-110.pdf
Ollendick, T. H., & King, N. J. (2004). Empirically supported treatments for children and
adolescents: Advances toward evidence-based practice. In P. M. Barrett & T. H.
Ollendick (Eds.) Handbook of interventions that work with children and
adolescents: Prevention and treatment (pp. 3-25). New York: John Wiley &
Sons.
Osher, D., Dwyer, K., & Jackson, S. (2004). Safe, supportive, and successful schools:
Step by step. Longmont, CO: Sopris West.
Ost, L., Svensson, L., Hellstrom, K., & Lindwall, R. (2001). One-session treatment of
specific phobia in youth: A randomized clinical trial. Journal of Consulting and
Clinical Psychology, 69, 814-824. doi:10.1037/0022-006X.69.5.814
Patterson, G.R., Reid, J.B., Jones, R.R., & Conger, R.E. (1975). A social learning
approach to family intervention: Families with aggressive children (Vol. 1).
Eugene, OR: Castalia.
Pealer, L., & Weiler, R.M. (2003). Guidelines for designing a web-delivered college
health risk behavior survey: Lessons learned from the University of Florida
behavior survey. Health Promotion Practice, 4, 171-179. doi:10.1177/
1524839902250772
Pediatric OCD Treatment Study Team. (2004). Cognitive-behavior therapy, sertraline,
and their combination for children and adolescents with obsessive-compulsive
disorder. Journal of the American Medical Association, 292, 1969-1976.
Peed, S., Roberts, M., & Forehand, R. (1977). Evaluation of the effectiveness of a
standardized parent training program in altering the interaction of mothers and
their noncompliant children. Behavior Modification, 1, 323-350. doi:10.1177
/014544557713003
Pelham, W. E., & Fabiano, G. A. (2008). Evidence-based psychosocial treatments for
attention-deficit/hyperactivity disorder. Journal of Clinical Child & Adolescent
Psychology, 37, 184-214. doi: 10.1080/15374410701818681
Pelham, W.E., Fabiano, G.A., & Massetti, G.M. (2005). Evidence-based assessment of
attention deficit hyperactivity disorder in children and adolescents. Journal of
Clinical Child and Adolescent Psychology, 34, 449-476. doi:10.1207/
s15374424jccp3403_5
Pelham, W.E., Gnagy, E.M., Greiner, A.R., Hoza, B., Hinshaw, S.P., Swanson, J.M.
(2000). Behavioral vs. behavioral and pharmacological treatment in ADHD
/cpb.2006.9.548
Rehabilitation Act of 1973, Pub. L. No. 93-112 (1973). Retrieved from
http://www.dotcr.ost.dot.gov/documents/ycr/REHABACT.HTM
Reisman, J.M. (1966). The development of clinical psychology. New York: Appleton-Century-Crofts.
Reisner, E. H. (1915). Evolution of the common school. New York: Macmillan.
Reschly, D., Tilly, W. D., & Grimes, J. (2000). Special education in transition:
Functional assessment and noncategorical programming. Longmont, CO: Sopris
West.
Reschly, D. J., & Connolly, L. M. (1990). Comparisons of school psychologists in the
city and country: Is there a rural school psychology? School Psychology
Review, 19, 534-549.
Reschly, D. J., & Wilson, M. S. (1995). School psychology practitioners and faculty:
1986 to 1991-92 trends in demographics, roles, satisfaction, and system reform.
School Psychology Review, 24, 62-80.
Reynolds, C. R., Gutkin, T., Elliott, S. N., & Witt, J. C. (1984). School psychology:
Essentials of theory and practice. New York: John Wiley.
Reynolds, W.M., & Coats, K. (1986). A comparison of cognitive-behavioral therapy and
relaxation training for the treatment of depression in adolescents. Journal of
Consulting and Clinical Psychology, 54, 653-660. doi:10.1037/0022-006X.54.5.653
Ringel, J., & Sturm, R. (2001). National estimates of mental health utilization and
expenditure for children in 1998. Journal of Behavioral Health Services &
Russell, G.F.M., Szmukler, G.I., Dare, C., & Eisler, I. (1987). An evaluation of family
therapy in anorexia nervosa and bulimia nervosa. Archives of General
Psychiatry, 44, 1047-1056.
Sackett, D. L., Rosenberg, W. M., Gray, J. A., Haynes, R. B., & Richardson, W. S.
(1996). Evidence based medicine: What it is and what it isn't. British Medical
Journal, 312, 71-72.
Sanders, M.R., Markie-Dadds, C., Tully, L.A., & Bor, W. (2000). The Triple P- Positive
Parenting Program: A comparison of enhanced, standard, and self-directed
behavioral family intervention for parents of children with early onset conduct
problems. Journal of Consulting and Clinical Psychology, 68, 624-640. doi:10.1037/0022-006X.68.4.624
Sandoval, J. (1993). The history of interventions in school psychology. Journal of School
Psychology, 31, 195-217.
Santisteban, D.A., Coatsworth, D.J., Perez-Vidal, A., Kurtines, W.M., Schwartz, S.J.,
& LaPerriere, A. (2003). Efficacy of brief strategic family therapy in modifying
Hispanic adolescent behavior problems and substance abuse. Journal of Family
Psychology, 17, 121-133.
Sax, L.J., Gilmartin, S.K., & Bryant, A.N. (2003). Assessing response rates and
nonresponse bias in web and paper surveys. Research in Higher Education, 44,
409-432. doi:10.1023/A:1024232915870
Schmidt, U., Lee, S., Beecham, J., Perkins, S., Treasure, J., … Yi, I. (2007). A
randomized controlled trial of family therapy and cognitive behavioral therapy
guided self-care for adolescents with bulimia nervosa and related disorders.
Shinn, M.R. (2008). Best practices in curriculum-based measurement in a problem-solving model. In A. Thomas & J. Grimes (Eds.), Best practices in school
psychology (5th ed., pp. 243-261). Bethesda, MD: National Association of School
Psychologists.
Silverman, W. K., & Hinshaw, S. P. (2008). The second special issue on evidence-based
psychosocial treatments for children and adolescents: A 10-year update. Journal
of Clinical Child & Adolescent Psychology, 37, 1-7. doi:10.1080/
15374410701817725
Silverman, W.K., Kurtines, W.M., Ginsburg, G.S., Weems, C.F., Lumpkin, P.W., &
Carmichael, D.H. (1999). Treating anxiety disorders in children with group
cognitive-behavioral therapy: A randomized clinical trial. Journal of Consulting
and Clinical Psychology, 67, 995-1003. doi:10.1037/0022-006X.67.6.995
Silverman, W.K., Kurtines, W.M., Ginsburg, G.S., Weems, C.F., Rabian, B., & Serafini,
L.T. (1999). Contingency management, self-control, and education support in the
treatment of childhood phobic disorders: A randomized clinical trial. Journal of
Consulting and Clinical Psychology, 67, 675-687. doi:10.1037/0022-006X.67.5.675
Silverman, W. K., Pina, A. A., & Viswesvaran, C. (2008). Evidence-based psychosocial
treatments for phobic and anxiety disorders in children and adolescents. Journal
of Clinical Child and Adolescent Psychology, 37, 105-130. doi:10.1080/15374410701817907
Smith, D. K. (1984). Practicing school psychologists: Their characteristics, activities, and
populations served. Professional Psychology: Research and Practice, 15, 798-810.
doi:10.1037/0735-7028.15.6.798
Smith, T., Lovaas, N.W., & Lovaas, O.I. (2002). Behaviors of children with high-functioning autism when paired with typically developing versus delayed peers:
A preliminary study. Behavioral Interventions, 17, 129-143. doi:10.1002/bin.114
Sonuga-Barke, E.J.S., Daley, D., Thompson, M., Laver-Bradbury, C., & Weeks, A.
(2001). Parent-based therapies for preschool attention-deficit/hyperactivity
disorder: A randomized, controlled trial with a community sample. Journal of
the American Academy of Child and Adolescent Psychiatry, 40, 402-408. doi:10.1097/00004583-200104000-00008
Spence, S.H., Donovan, C., & Brechman-Toussaint, M. (2000). The treatment of
childhood social phobia: The effectiveness of a social skills training-based,
cognitive-behavioral intervention, with and without parental involvement.
Journal of Child Psychology and Psychiatry, 41, 713-726. doi:10.1111/1469-7610.00659
Spence, S.H., Holmes, J.M., March, S., & Lipp, O.V. (2006). The feasibility and outcome
of clinic plus internet delivery of cognitive-behavior therapy for childhood
anxiety. Journal of Consulting and Clinical Psychology, 74, 614-621. doi:10.1037/0022-006X.74.3.614
Stark, K.D., Reynolds, W.M., & Kaslow, N.J. (1987). A comparison of the relative
efficacy of self-control therapy and behavior problem-solving therapy for
depression in children. Journal of Abnormal Child Psychology, 15, 91-113.
doi:10.1007/BF00916468
Stark, K.D., Rouse, L., & Livingston, R. (1991). Treatment of depression during
Treptow, M.A., Burns, M.K., & McComas, J.J. (2006). Reading at the frustration,
instructional, and independent levels: Effects on student time on task and
comprehension. School Psychology Review, 36, 159-166.
Upah, K. R. F. (2008). Best practices in designing, implementing, and evaluating quality
interventions. In A. Thomas & J. Grimes (Eds.), Best practices in school
psychology (5th ed., pp.209-220). Bethesda, MD: National Association of School
Psychologists.
U.S. Department of Education. (2003). Proven methods: Questions and answers on No
Child Left Behind. Retrieved from http://www2.ed.gov/nclb/methods/whatworks/
doing.html
U.S. Department of Education. (2009). Race to the Top program executive summary.
Retrieved from http://www2.ed.gov/programs/racetothetop/executivesummary.pdf
U. S. Department of Education (2010). An overview of the U. S. Department of
Education. Retrieved from http://www2.ed.gov/about/overview/focus/what.html
U.S. Department of Education, Institute of Education Sciences. (2011). National
Assessment of Educational Progress (NAEP). Retrieved from http://nces.ed.gov/
nationsreportcard/about/
U.S. Department of Education, Office of Planning, Evaluation and Policy Development.
(2010). ESEA Blueprint for reform. Retrieved from http://www2.ed.gov/policy/
Elsec/leg/blueprint/blueprint.pdf
U. S. Department of Health and Human Services. (1999). Mental health: A report of the
Surgeon General. Rockville, MD: Author.
outcomes for youth with problem alcohol use. Paper presented at the 2005 Joint
Meeting on Adolescent Effectiveness, Washington, DC.
Waldron, H.B., Slesnick, N., Brody, J.L., Turner, C.W., & Peterson, T.R. (2001).
Treatment outcomes for adolescent substance abuse at 4- and 7-month
assessments. Journal of Consulting and Clinical Psychology, 69, 802-813.
doi:10.1037/0022-006X.69.5.802
Waldron, H. B., & Turner, C. W. (2008). Evidence-based psychosocial treatments for
adolescent substance abuse. Journal of Clinical Child and Adolescent Psychology, 37, 238-261. doi:10.1080/15374410701820133
Walker, H. M., Horner, R. H., Sugai, G., Bullis, M., Sprague, J., & Bricker, D.
(1996). Integrated approaches to preventing antisocial behavior patterns among
school-age children and youth. Journal of Emotional and Behavioral Disorders,
4, 194-209. doi:10.1177/106342669600400401
Walker, H.M., Kavanagh, K., Stiller, B., Golly, A., Severson, H.H., & Feil, E.G. (1998).
First step to success: An early intervention approach for preventing school
antisocial behavior. Journal of Emotional and Behavioral Disorders, 6, 66-80.
doi:10.1177/106342669800600201
Wallin, J. E. W. (1914). The mental health of the school child. New Haven, CT: Yale
University Press.
Wallin, J. E. W., & Ferguson, D.G. (1967). The development of school psychological
services. In J. F. Magary (Ed.), School psychological services in theory and
practice: A handbook (pp. 1-29). Englewood Cliffs, NJ: Prentice-Hall.
Wang, M. C., Haertel, G. D., & Walberg, H. J. (1990). What influences learning? A
Weisz, J.R., Thurber, C., Sweeney, L., Profitt, V., & LeGagnoux, G. (1997). Brief
treatment of mild to moderate child depression using primary and secondary
control enhancement training. Journal of Consulting and Clinical Psychology,
65, 703-707. doi:10.1037/0022-006X.65.4.703
Wells, K.C., & Egan, J. (1988). Social learning and systems family therapy for childhood
oppositional disorder: Comparative treatment outcome. Comprehensive
Psychiatry, 29, 138-146. doi:10.1016/0010-440X(88)90006-5
West, A.E., Jacobs, R.H., Westerholm, R., Lee, A., Carbray, J., Heidenreich, J., & Pavuluri,
M.N. (2009). Child and family-focused cognitive-behavioral therapy for pediatric
bipolar disorder: Pilot study of group treatment format. Journal of the Canadian
Academy of Child and Adolescent Psychiatry, 18(3), 239-246.
Williams, S. (2010). RTI: Practical strategies for school psychologists (grades K-12).
Medina, WA: Institute for Educational Development.
Witmer, L. (1897). The organization of practical work in psychology. Psychological
Review, 4, 116-117.
Wolf, M.M. (1978). Social validity: The case for subjective measurement, or how applied
behavior analysis is finding its heart. Journal of Applied Behavior Analysis, 11,
203-214. doi:10.1901/jaba.1978.11-203
Wood, A., Harrington, R., & Moore, A. (1996). Controlled trial of a brief cognitive-behavioral intervention in adolescent patients with depressive disorders. Journal
of Child Psychology and Psychiatry, 37, 737-746. doi:10.1111/j.1469-7610.1996
.tb01466.x
Wood, J.J., Piacentini, J.C., Southam-Gerow, M., Chu, B., & Sigman, M. (2006). Family
cognitive behavioral therapy for child anxiety disorders. Journal of the
American Academy of Child and Adolescent Psychiatry, 45, 314-321. doi:10.1097
/01.chi.0000196425.88341.b0
Wright, P. W. D., & Wright, P. D. (2009). Special education law (2nd ed.). Hartfield, VA:
Harbor House Law Press.
Yates, M. A. (2003). A survey of the counseling practices of school psychologists.
(Unpublished doctoral dissertation). University at Albany, State University of
New York, Albany, NY.
Yeaton, W.H., & Sechrest, L. (1981). Critical dimensions in the choice and maintenance
of successful treatments: Strength, integrity, and effectiveness. Journal of
Consulting and Clinical Psychology, 49, 156-167. doi:10.1037/0022-006X.49.2.156
Young, M. E., & Fristad, M. A. (2007). Evidence-based treatments for bipolar disorder in
children and adolescents. Journal of Contemporary Psychotherapy, 37, 157-164.
doi:10.1007/s10879-007-9050-4
Yu, D.L., & Seligman, M.E.P. (2002). Preventing depressive symptoms in Chinese
children. Prevention and Treatment, 5, Article 9.
__________ Assessment
__________ Direct Interventions
__________ Consultation and Indirect services
__________ Research
__________ Administration
__________ Systems-level activities
__________ Other
8) In what type of school do you primarily work?
o Rural
o Suburban
o Urban
o Mixed
o Other (please specify)
______________________________
9) What is the psychologist:student ratio at your school district?
o 1:<500
o 1:500-999
o 1:1000-1499
o 1:1500-2000
o 1:>2000
10) What region do you work in?
o Northeast (Connecticut, Maine, Massachusetts, New Hampshire, Rhode
Island, Vermont, New Jersey, New York, Pennsylvania)
o Midwest (Indiana, Illinois, Michigan, Ohio, Wisconsin, Iowa, Nebraska,
Kansas, North Dakota, Minnesota, South Dakota, Missouri)
o South (Delaware, District of Columbia, Florida, Georgia, Maryland, North
Carolina, South Carolina, Virginia, West Virginia, Alabama, Kentucky,
Mississippi, Tennessee, Arkansas, Louisiana, Oklahoma, Texas)
o West (Arizona, Colorado, Idaho, New Mexico, Montana, Utah, Nevada,
Wyoming, Alaska, California, Hawaii, Oregon, Washington)
11) Have you attended any continuing education programs over the past 5 years that
were specifically focused on (check all that apply):
o The No Child Left Behind Act
o Accountability for student academic and behavioral outcomes
o The provision of counseling services
o Evidence-based behavioral interventions
o Data-based decision making
o Response to Intervention
o Other (please specify)
______________________________
12) How many years has it been since you received your last degree?
o 0-5
o 6-10
o >10
13) Was your graduate program accredited by (check all that apply):
o NASP
o APA
o NCATE
o Your state
o Not accredited
14) What is the highest degree that you have earned?
o MA/MS
o Certificate/Specialist
o PhD/PsyD/EdD
o Other (please specify)
______________________________
15) Did your graduate academic training include specific coursework in the following
areas (please check all that apply):
o Academic interventions
o Behavioral interventions
o Counseling and Psychotherapy with children
o Counseling children with developmental disabilities
o Group counseling
o Multicultural counseling
16) When planning, implementing, or updating a counseling intervention, do you use
any print or online resources (e.g., IEP Pro, IEP Direct) to help you write goals,
clarify the problem, or determine expectations for the student?
o Yes
o No
17) When planning for a counseling intervention, do you come up with a behavioral
definition of the problem?
o Yes
o No
When coming up with a definition of the behavior to be addressed in counseling,
what factors do you include? (Yes, No)
18) Action verbs describing what the student does in observable terms
19) Frequency (the number of times the behavior occurs during an observation period)
20) Latency (how much time passes between the presentation of a stimulus and the student's response or behavior)
21) Intensity (the strength or force with which the behavior is displayed)
22) Topography (the configuration, shape, or form of the behavior)
23) Accuracy (a measure of how the student's behavior is correct or fits a standard)
24) Duration (how much time passes between the onset and the ending of a behavior)
25) When planning for a counseling intervention, do you collect baseline data before
beginning the intervention?
o Yes
o No
When you collect baseline data, how often do you use each of the following
techniques? (Never, Sometimes, Always)
26) Direct behavioral observation
27) 3rd party behavior rating (from parent, teacher, or related service provider)
28) Sociometric techniques
29) 3rd party interview
30) Objective self-report
31) Projective-expressive technique
32) On average, how many baseline data points do you collect in order to establish a
stable pattern of the student's behavior?
o 1
o 2
o 3
o 4
o 5
o 6
o 7 or greater
33) When planning for a counseling intervention, do you validate the problem
behavior by comparing the identified student with a peer or a standard of
performance?
o Yes
o No
34) When planning for a counseling intervention, do you analyze the problem
When writing a counseling intervention plan, how often do you use each of the
following components? (Never, Sometimes, Always)
41) A clear description of the procedures to be used
42) Documentation that the strategies to be used have been empirically validated in the literature on evidence-based interventions
43) A description of the specific steps and activities that will be engaged in during counseling sessions
44) A description of how each step or activity will be completed
45) The materials needed for each step or activity
46) A description of what each person engaged in the activity will do
47) The location where the intervention is to take place
48) When planning for a counseling intervention, do you come up with a plan to
measure the problem behavior?
o Yes
o No
When coming up with a plan for measuring the target behavior, how often do you
include each of the following components? (Never, Sometimes, Always)
49) A behavioral definition of the target behavior
50) A clear description of where the behavior will be measured
51) A clear description of when the behavior will be measured
52) A clear delineation of who will measure the behavior
53) A description of the recording method most appropriate for the behavior
54) A description of the most appropriate recording measure
55) When planning for an intervention, do you come up with a decision-making plan
for determining how behavioral data on the student will be collected and
interpreted?
o Yes
o No
When developing a decision-making plan, how often do you use each of the
following components? (Never, Sometimes, Always)
56) A determination of the frequency of behavioral measurements and data to be collected
57) A decision on how the data will be summarized for the purposes of intervention evaluation (e.g., visual presentation, written report or summary)
58) A determination of how many behavioral data points will be collected before the intervention data will be analyzed
59) A determination of how much time will pass before the intervention data will be analyzed
60) A set of decision rules for responding to specific data points
When you collect progress monitoring data, how often do you use each of the
following techniques? (Never, Sometimes, Always)
62) Direct behavioral observation
63) 3rd party behavior rating scales (from parent, teacher, or related service provider)
64) Sociometric techniques
65) Interviews
66) Objective self-report measures
67) Projective-expressive techniques
68) On average, how many progress monitoring data points do you collect to
establish a stable pattern of the student's behavior?
o 1
o 2
o 3
o 4
o 5
o 6
o 7
o 8
o >8
69) Do you use the same method for collecting baseline data points as you do for
collecting progress monitoring data points?
o Yes
o Sometimes, depending on the situation
o No
70) During the implementation of a counseling intervention, do you engage in any
formative assessment of the students behavior?
o Yes
o No
When you engage in formative assessment of the student's behavior, what sources
of data do you consider? (Yes, No)
71) The level of the behavior (how much the behavior is occurring during baseline and intervention phases as judged by repeated, objective measurements of its frequency, duration, intensity, or the percentage of intervals in which it occurs)
behavioral referrals)
88) Thank you for taking the time to complete this survey! If you have any feedback
or comments related to this experience that you would like to share with the
researchers, please feel free to enter it here.
Thank you!
If you would like to have your name entered into a raffle for one of two $50.00 gift
certificates, please click this link: https://www.psychdata.com/s.asp?SID=143237. You
will then be prompted to provide your name and email address. The information you
provided on the survey will in no way be connected with your contact information.
Winners will be chosen at random and notified once all data have been collected.
If you have any further questions, please feel free to contact me or my research advisor.
Rebecca Cole, M.S.
Doctoral Candidate
School Psychology
rmcole@albany.edu
Deborah Kundert
Dissertation Chair
School Psychology
dkundert@albany.edu
Research Question: 2) Approximately how many students do school psychologists recommend declassifying from counseling each year, and what reasons are most commonly cited when making this recommendation?
Data Analyses: Frequencies, Percentages
(The remaining rows of this Research Question/Data Analyses table are not recoverable.)
32) On average, how many baseline data points do you collect in order to
establish a stable pattern of the student's behavior?
o 1
o 2
o 3
o 4
o 5
o 6
o 7 or greater

When you collect baseline data, how often do you use each of the
following techniques? (Never, Sometimes, Always)
26) Direct behavioral observation
27) 3rd party behavior rating (from parent, teacher, or related service provider)
28) Sociometric techniques
29) 3rd party interview
30) Objective self-report
31) Projective-expressive technique

When coming up with a definition of the behavior to be addressed in
counseling, what factors do you include? (Yes, No)
18) Action verbs describing what the student does in observable terms
19) Frequency (the number of times the behavior occurs during an observation period)
20) Latency (how much time passes between the presentation of a stimulus and the student's response or behavior)
21) Intensity (the strength or force with which the behavior is displayed)
22) Topography (the configuration, shape, or form of the behavior)
23) Accuracy (a measure of how the student's behavior is correct or fits a standard)
24) Duration (how much time passes between the onset and the ending of a behavior)

When coming up with a plan for measuring the target behavior, how often
do you include each of the following components? (Never, Sometimes, Always)
49) A behavioral definition of the target behavior

When writing a counseling intervention plan, how often do you use each
of the following components? (Never, Sometimes, Always)
41) A clear description of the procedures to be used
42) Documentation that the strategies to be used have been empirically validated in the literature on evidence-based interventions
43) A description of the specific steps and activities that will be engaged in during counseling sessions
44) A description of how each step or activity will be completed
45) The materials needed for each step or activity
46) A description of what each person engaged in the activity will do
47) The location where the intervention is to take place

When you collect progress monitoring data, how often do you use each of
the following techniques? (Never, Sometimes, Always)
62) Direct behavioral observation
63) 3rd party behavior rating scales (from parent, teacher, or related service provider)
64) Sociometric techniques
65) Interviews
66) Objective self-report measures
67) Projective-expressive techniques

69) Do you use the same method for collecting baseline data points as you
do for collecting progress monitoring data points?
o Yes
o Sometimes, depending on the situation
o No

Research Question: Are there differences between school psychologists in terms of their use of the general steps of the problem-solving model when designing and implementing counseling as a direct intervention?
Data Analyses: Percentages, Chi Square Analyses
(Chi square analysis; the table title and row labels on this page are not recoverable. The column totals match the psychologist:student ratio categories.)

                  1:<500   1:500-999   1:1000-1499   1:1500-2000   1:>2000   Total
Observed            27.0      55.0         51.0          39.0         21.0     193
Expected            27.7      55.4         54.5          33.6         21.8     193
Std. Residual       -0.1      -0.1         -0.5           0.9         -0.2
Observed             6.0      11.0         14.0           1.0          5.0      37
Expected             5.3      10.6         10.5           6.4          4.2      37
Std. Residual        0.3       0.1          1.1          -2.1          0.4
Total Observed      33.0      66.0         65.0          40.0         26.0     230
Total Expected      33.0      66.0         65.0          40.0         26.0     230
Chi Square Analysis Comparing Use of Behavioral Goals and Psychologist:Student Ratio

                                    Psychologist:Student Ratio
                          1:<500   1:500-999   1:1000-1499   1:1500-2000   1:>2000   Total
Set behavioral goal(s)
  Observed                  27.0      59.0         52.0          29.0         22.0     189
  Expected                  27.0      54.0         54.0          32.7         21.3     189
  Std. Residual              0.0       0.7         -0.3          -0.7          0.2
Do Not Set behavioral goal(s)
  Observed                   6.0       7.0         14.0          11.0          4.0      42
  Expected                   6.0      12.0         12.0           7.3          4.7      42
  Std. Residual              0.0      -1.4          0.6           1.4         -0.3
Total
  Observed                  33.0      66.0         66.0          40.0         26.0     231
  Expected                  33.0      66.0         66.0          40.0         26.0     231
Note: χ² = 5.43, df = 4, Sig. = 0.246
Chi Square Analysis Comparing Use of Intervention Plans and Years of Experience

                                  Years of Experience
                                  0-5      6-10      >10     Total
Use Intervention Plans
  Observed                        76.0     37.0      81.0     194
  Expected                        81.7     37.4      74.9     194
  Std. Residual                   -0.6     -0.1       0.7
Do Not Use Intervention Plans
  Observed                        20.0      7.0       7.0      34
  Expected                        14.3      6.6      13.1      34
  Std. Residual                    1.5      0.2      -1.7
Total
  Observed                        96.0     44.0      88.0     228
  Expected                        96.0     44.0      88.0     228
(Row labels inferred from the table title; the chi square note is not recoverable.)
Chi Square Analysis Comparing Use of Decision-Making Plans and Time Spent
Counseling

                                  Time Spent Counseling
                              Low (0-9%)     Mid      High    Total
Use Decision-Making Plans
  Observed                        41.0       76.0     37.0     154
  Expected                        45.3       67.6     41.1     154
  Std. Residual                   -0.6        1.0     -0.6
Do Not Use Decision-Making Plans
  Observed                        24.0       21.0     22.0      67
  Expected                        19.7       29.4     17.9      67
  Std. Residual                    1.0       -1.6      1.0
Total
  Observed                        65.0       97.0     59.0     221
  Expected                        65.0       97.0     59.0     221
Note: χ² = 6.15, df = 2, Sig. = 0.046
(Only the "Low (0-9%)" column heading is recoverable; "Mid" and "High" are placeholders.)
Chi Square Analysis Comparing Use of Problem Analysis and Graduate Degree Earned

                                       Degree Level
                              MA, MS, Specialist,   PhD/PsyD/
                                  Certificate          EdD       Total
Use Problem Analysis
  Observed                          122.0              40.0       162
  Expected                          122.5              39.5       162
  Std. Residual                       0.0               0.1
Do Not Use Problem Analysis
  Observed                           55.0              17.0        72
  Expected                           54.5              17.5        72
  Std. Residual                       0.1              -0.1
Total
  Observed                          177.0              57.0       234
  Expected                          177.0              57.0       234
Note: χ² = 0.03, df = 1, Sig. = 0.859
Chi Square Analysis Comparing Use of Problem Analysis and Years of Experience

                                  Years of Experience
                                  0-5      6-10      >10     Total
Use Problem Analysis
  Observed                        68.0     30.0      64.0     162
  Expected                        69.2     30.5      62.3     162
  Std. Residual                   -0.1     -0.1       0.2
Do Not Use Problem Analysis
  Observed                        32.0     14.0      26.0      72
  Expected                        30.8     13.5      27.7      72
  Std. Residual                    0.2      0.1      -0.3
Total
  Observed                       100.0     44.0      90.0     234
  Expected                       100.0     44.0      90.0     234
(The chi square note for this table is not recoverable.)
Chi Square Analysis Comparing Students Served in Counseling and Years of Experience

                                  Years of Experience
Students                          0-5      6-10      >10     Total
Special Education
  Observed                        33.0     12.0      16.0      61
  Expected                        26.1     11.9      23.0      61
  Std. Residual                    1.4      0.0      -1.5
General Education
  Observed                         0.0      0.0       3.0       3
  Expected                         1.3      0.6       1.1       3
  Std. Residual                   -1.1     -0.8       1.8
Special and General Education
  Observed                        70.0     35.0      72.0     177
  Expected                        75.6     34.5      66.8     177
  Std. Residual                   -0.6      0.1       0.6
Total
  Observed                       103.0     47.0      91.0     241
  Expected                       103.0     47.0      91.0     241
Note: χ² = 9.76, df = 4, Sig. = 0.045
Chi Square Analysis Comparing Number of Students Discontinued From Counseling and
Time Spent Counseling

                                  Time Spent Counseling
Number of Students            Low (0-9%)     Mid      High    Total
0
  Observed                         8.0       15.0     11.0      34
  Expected                         8.3       16.2      9.5      34
  Std. Residual                   -0.1       -0.3      0.5
1-3
  Observed                        37.0       57.0     26.0     120
  Expected                        29.3       57.1     33.5     120
  Std. Residual                    1.4        0.0     -1.3
4-6
  Observed                         9.0       18.0     17.0      44
  Expected                        10.8       20.9     12.3      44
  Std. Residual                   -0.5       -0.6      1.3
7
  Observed                         2.0       19.0     10.0      31
  Expected                         7.6       14.8      8.7      31
  Std. Residual                   -2.0        1.1      0.5
Total
  Observed                        56.0      109.0     64.0     229
  Expected                        56.0      109.0     64.0     229
Note: χ² = 12.06, df = 6, Sig. = 0.061
(Only the "Low (0-9%)" column heading is recoverable; "Mid" and "High" are placeholders.)
[Logistic regression analyses: the row and column labels of these tables are not recoverable, so only summary values survive. Each analysis reported B (SE) for the model constant, Wald statistics with df and Sig. values for the predictors, Block 1 model summaries (-2 log likelihood, Cox and Snell R², Nagelkerke R²) across model-building steps, and Hosmer and Lemeshow tests. Recoverable constants, B (SE): -1.95 (.195), -1.24 (.156), -1.641 (.180), -.800 (.143), -1.493 (.171), -1.745 (.189), -1.280 (.165), -.821 (.148), -1.569 (.186), -.221 (.142), -1.295 (.172).]