Advancing Evidence-based Practice Through Program Evaluation: A Practical Guide for School-based Professionals

Oxford University Press, 2018 - Behavior modification - 200 pages
Given the current climate of results-driven accountability, school-based professionals have a significant contribution to make in improving the impact of programs and initiatives through the application of program evaluation methods and tools to inform decision making within a multi-tier system of supports framework. And yet there is currently a dearth of practical resources dedicated to developing school psychologists' competencies in program evaluation.

Advancing Evidence-Based Practice through Program Evaluation will meet the needs of school psychologists and other school-based professionals seeking to use program evaluation approaches to enhance data-based decision making and accountability at the program and systems levels. This practical guide provides the most cutting-edge evaluation frameworks, methods, and tools available, with particular emphasis on the rapidly developing areas of implementation research, evidence-based professional learning, and innovative approaches to communicating evaluation findings. The book will support school professionals in daily practice by enhancing and extending their knowledge and skills in measurement, assessment, consultation for systems change, and the use of evidence-based interventions for academic and social/behavioral concerns, with a focus on evaluating the implementation and outcomes of school-based programs. The book will also facilitate the professional development of those currently engaged in graduate preparation programs in education, educational leadership, school counseling, and school social work, as well as the university faculty who guide their professional preparation. Finally, school professionals may also use Advancing Evidence-Based Practice through Program Evaluation to develop their professional competencies in implementing new initiatives funded by grants with clear expectations for program evaluation.




1 Introduction to Program Evaluation
2 Evaluating Implementation
3 Evaluating Professional Learning
4 Developing an Evaluation Plan
5 Communicating Evaluation Findings
6 Case Studies Using Program Evaluation to Drive Evidence-Based Practices
Program Evaluation Standards
Observation Checklist for High-Quality Professional Development Training
Evaluation Planning Template
Assessment Schedule
Responsibility Matrix
District Implementation Team (DIT) Communication Plan
About the Authors

Norwegian Teacher Self-Efficacy Scale
Example of a Logic Model



About the author (2018)

Julie Q. Morrison is an Associate Professor in the School Psychology Program at the University of Cincinnati. Her research interests include evaluating the effectiveness of universal and targeted interventions to address the academic and behavioral needs of school-age children and youth within a multi-tier system of supports framework. She has more than 20 years of experience as an evaluator of educational initiatives implemented at the state, regional, and district levels. Currently, she serves on the Joint Committee on Standards for Educational Evaluation as a liaison member representing the National Association of School Psychologists.

Anna Harms is the Evaluation and Research Coordinator for Michigan's Integrated Behavior and Learning Support Initiative (MIBLSI). MIBLSI is a statewide initiative designed to support regional educational service agencies, districts, and schools in the development and implementation of evidence-based practices within a multi-tiered delivery system. In her current role, she leads the design and implementation of internal evaluation for MIBLSI. This includes work on three federally funded grants. She coordinates a team that provides support to local districts and schools around multi-tier systems of support (MTSS) measurement and evaluation, including research projects on universal screening, data-based decision making, and assessment construction and validation.
