Table of Contents
Title Page
Copyright Page
Dedication
Preface
ORGANIZATION
SPECIAL FEATURES
Acknowledgments
PART I - OVERVIEW OF EVIDENCE-BASED PRACTICE
Chapter 1 - INTRODUCTION TO EVIDENCE-BASED PRACTICE
EMERGENCE OF EVIDENCE-BASED PRACTICE
DEFINING EVIDENCE-BASED PRACTICE
EVIDENCE-BASED PRACTICE IS NOT RESTRICTED TO CLINICAL DECISIONS
DEVELOPING AN EVIDENCE-BASED PRACTICE PROCESS OUTLOOK
EASIER SAID THAN DONE
ADDITIONAL READINGS
Chapter 2 - STEPS IN THE EBP PROCESS
STEP 1: QUESTION FORMULATION
STEP 2: EVIDENCE SEARCH
STEP 3: CRITICALLY APPRAISING STUDIES AND REVIEWS
STEP 4: SELECTING AND IMPLEMENTING THE INTERVENTION
STEP 5: MONITOR CLIENT PROGRESS
FEASIBILITY CONSTRAINTS
ADDITIONAL READINGS
Chapter 3 - RESEARCH HIERARCHIES
MORE THAN ONE TYPE OF HIERARCHY FOR MORE THAN ONE TYPE OF EBP QUESTION
QUALITATIVE AND QUANTITATIVE STUDIES
TYPES OF EBP QUESTIONS
ADDITIONAL READINGS
PART II - CRITICALLY APPRAISING STUDIES FOR EBP QUESTIONS ABOUT INTERVENTION EFFECTIVENESS
Chapter 4 - CRITERIA FOR INFERRING EFFECTIVENESS
INTERNAL VALIDITY
MEASUREMENT ISSUES
STATISTICAL CHANCE
EXTERNAL VALIDITY
ADDITIONAL READINGS
Chapter 5 - CRITICALLY APPRAISING EXPERIMENTS
CLASSIC PRETEST-POSTTEST CONTROL GROUP DESIGN
POSTTEST-ONLY CONTROL GROUP DESIGN
SOLOMON FOUR-GROUP DESIGN
ALTERNATIVE TREATMENT DESIGNS
DISMANTLING DESIGNS
PLACEBO CONTROL GROUP DESIGNS
EXPERIMENTAL DEMAND AND EXPERIMENTER EXPECTANCIES
OBTRUSIVE VERSUS UNOBTRUSIVE OBSERVATION
COMPENSATORY EQUALIZATION AND COMPENSATORY RIVALRY
RESENTFUL DEMORALIZATION
TREATMENT DIFFUSION
TREATMENT FIDELITY
PRACTITIONER EQUIVALENCE
DIFFERENTIAL ATTRITION
ADDITIONAL READINGS
Chapter 6 - CRITICALLY APPRAISING QUASI-EXPERIMENTS: NONEQUIVALENT COMPARISON ...
NONEQUIVALENT COMPARISON GROUPS DESIGNS
ADDITIONAL LOGICAL ARRANGEMENTS TO CONTROL FOR POTENTIAL SELECTIVITY BIASES
STATISTICAL CONTROLS FOR POTENTIAL SELECTIVITY BIASES
PILOT STUDIES
ADDITIONAL READINGS
Chapter 7 - CRITICALLY APPRAISING QUASI-EXPERIMENTS: TIME-SERIES DESIGNS AND ...
SIMPLE TIME-SERIES DESIGNS
MULTIPLE TIME-SERIES DESIGNS
SINGLE-CASE DESIGNS
ADDITIONAL READING
Chapter 8 - CRITICALLY APPRAISING SYSTEMATIC REVIEWS AND META-ANALYSES
ADVANTAGES OF SYSTEMATIC REVIEWS AND META-ANALYSES
RISKS IN RELYING EXCLUSIVELY ON SYSTEMATIC REVIEWS AND META-ANALYSES
WHERE TO START
WHAT TO LOOK FOR WHEN CRITICALLY APPRAISING SYSTEMATIC REVIEWS
WHAT DISTINGUISHES A SYSTEMATIC REVIEW FROM OTHER TYPES OF REVIEWS?
WHAT TO LOOK FOR WHEN CRITICALLY APPRAISING META-ANALYSES
ADDITIONAL READINGS
PART III - CRITICALLY APPRAISING STUDIES FOR ALTERNATIVE EBP QUESTIONS
Chapter 9 - CRITICALLY APPRAISING NONEXPERIMENTAL QUANTITATIVE STUDIES
SURVEYS
CROSS-SECTIONAL AND LONGITUDINAL STUDIES
CASE-CONTROL STUDIES
ADDITIONAL READINGS
Chapter 10 - CRITICALLY APPRAISING QUALITATIVE STUDIES
QUALITATIVE OBSERVATION
QUALITATIVE INTERVIEWING
QUALITATIVE SAMPLING
GROUNDED THEORY
FRAMEWORKS FOR APPRAISING QUALITATIVE STUDIES
ADDITIONAL READINGS
PART IV - ASSESSING CLIENTS AND MONITORING THEIR PROGRESS
Chapter 11 - CRITICALLY APPRAISING AND SELECTING ASSESSMENT INSTRUMENTS
RELIABILITY
VALIDITY
SENSITIVITY
FEASIBILITY
SAMPLE CHARACTERISTICS
LOCATING ASSESSMENT INSTRUMENTS
ADDITIONAL READINGS
Chapter 12 - MONITORING CLIENT PROGRESS
A PRACTITIONER-FRIENDLY DESIGN
FEASIBLE ASSESSMENT TECHNIQUES
SUMMARY
LOOKING AHEAD
ADDITIONAL READING
Appendix A - CRITICAL APPRAISALS OF STUDY SYNOPSES AT THE END OF CHAPTER 4
Appendix B - CRITICAL APPRAISALS OF STUDY SYNOPSES AT THE END OF CHAPTER 5
Appendix C - CRITICAL APPRAISALS OF STUDY SYNOPSES AT THE END OF CHAPTER 6
Appendix D - CRITICAL APPRAISALS OF STUDY SYNOPSES AT THE END OF CHAPTER 7
Appendix E - CRITICAL APPRAISALS OF STUDY SYNOPSES AT THE END OF CHAPTER 8
Appendix F - CRITICAL APPRAISALS OF STUDY SYNOPSES AT THE END OF CHAPTER 9
Appendix G - CRITICAL APPRAISALS OF STUDY SYNOPSES AT THE END OF CHAPTER 10
Appendix H - CRITICAL APPRAISALS OF STUDY SYNOPSES AT THE END OF CHAPTER 11
Glossary
References
Index
This book is printed on acid-free paper.
Copyright © 2008 by John Wiley & Sons, Inc. All rights reserved.
Published by John Wiley & Sons, Inc., Hoboken, New Jersey.
Published simultaneously in Canada.
Wiley Bicentennial Logo: Richard J. Pacifico
No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400, fax (978) 646-8600, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008, or online at http://www.wiley.com/go/permissions.
Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
This publication is designed to provide accurate and authoritative information in regard to the subject matter covered. It is sold with the understanding that the publisher is not engaged in rendering professional services. If legal, accounting, medical, psychological or any other expert assistance is required, the services of a competent professional person should be sought.
Designations used by companies to distinguish their products are often claimed as trademarks. In all instances where John Wiley & Sons, Inc. is aware of a claim, the product names appear in initial capital or all capital letters. Readers, however, should contact the appropriate companies for more complete information regarding trademarks and registration.
For general information on our other products and services please contact our Customer Care Department within the United States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.
Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic books. For more information about Wiley products, visit our web site at www.wiley.com.
Library of Congress Cataloging-in-Publication Data:
Rubin, Allen.
Practitioner’s guide to using research for evidence-based practice / by Allen Rubin.
p. ; cm.
Includes bibliographical references and index.
ISBN 978-0-470-13665-2 (cloth : paper)
1. Psychotherapy—Research—Methodology. 2. Psychotherapy—Evaluation—Methodology. 3. Evidence-based psychiatry. I. Title.
[DNLM: 1. Psychotherapy. 2. Evaluation Studies. 3. Outcome and Process Assessment (Health Care) 4. Research Design. WM 420 R8948p 2008]
RC337.R73 2008
616.89′140072—dc22
2007013693
To human service practitioners whose compassion and professionalism spur them to persevere, despite limited support, to seek and critically appraise research evidence so that they can maximize the chances that their efforts will be effective in helping people in need.
Preface
Helping professionals these days are hearing a great deal about evidence-based practice (EBP) and are experiencing increasing pressure to engage in it. In fact, EBP has become part of the definition of ethical practice.
Accompanying the growth in the popularity of EBP in the human services field is a growing concern about how rarely practitioners engage in the EBP process. Various pragmatic factors have been cited to explain this gap, such as time constraints and lack of agency access to bibliographic databases. Another factor is that practitioners typically do not retain the research knowledge that they learned as students. Many practitioners, therefore, are likely to feel unable to implement the EBP process because they feel incapable of accurately appraising the quality of research studies.
There are various reasons why practitioners may not retain the research knowledge that they learned as students. One is simply the passage of time. Exacerbating that factor is that, early in their careers, they are unlikely to encounter expectations from superiors that they use the research knowledge they gained in school. Another factor is the way that research courses may have been taught. Typically, the emphasis in teaching research has been more on how to do research in the role of researcher than on appraising and using research in the role of a practitioner engaged in EBP. Little wonder, then, that so many students who aspire to be service providers—and not researchers—lack enthusiasm for their research courses and soon forget much of what they learned in them.
Consequently, when service providers attempt to heed the call to engage in EBP by finding and appraising research studies, they are likely to have difficulty differentiating between studies that contain reasonable limitations and those that contain fatal flaws. That is, they are likely to feel unable to judge whether a study's limitations merely warrant viewing the study with some caution or justify disregarding it as too egregiously flawed to be worthy of guiding their practice. Lacking confidence in this judgment, it's easy for practitioners to feel discouraged about engaging in EBP.
This book attempts to alleviate that problem. Rather than discussing research from the standpoint of preparing to do research, it provides a practitioner-oriented guide to appraising and using research as part of the EBP process. Current and future practitioners can use this book as a user-friendly reference to help them engage in all the steps of the EBP process, including that step in which they must differentiate between acceptable methodological research limitations and fatal flaws and accurately judge the degree of caution warranted in considering whether a study’s findings merit guiding practice decisions.
By maintaining a constant focus on explaining in a practitioner-friendly manner how to appraise and use research in the context of the EBP process, this book can help readers feel that they are learning about research concepts relevant to their practice—research concepts that can help them improve their implementation of EBP. In turn, the book attempts to empower and motivate readers to engage in that process.
Although most of the book's contents focus on critically appraising research to answer EBP questions, its final chapter simplifies how practitioners can use research methods to evaluate their own practice. That's because the final step in the EBP process requires that practitioners employ research techniques to monitor client progress and evaluate whether their client achieved the desired outcome. However, unlike other texts that emphasize rigor in pursuit of causal inferences in single-case designs, the final chapter of this book is based on the premise that the practitioner is simply assessing whether clients appear to be benefiting from an intervention whose probabilistic effectiveness has already been supported in the studies the practitioner examined during the EBP process of searching for and appraising existing evidence. Thus, the emphasis in the final chapter is on feasibility. In light of the much-researched problem of practitioners eschewing the application of single-case designs in their practice, this book's unique emphasis is intended to increase the extent to which practitioners will use single-case design methods to monitor client progress.
In summary, this book aims to provide human services practitioners with what they need to know about various research designs and methods so that when engaging in the EBP process they can:
• Determine which interventions, programs, policies, and assessment tools are supported by the best evidence.
• Find and critically appraise qualitative and quantitative research studies in seeking evidence to answer different kinds of EBP questions.
• Differentiate between acceptable limitations and fatal flaws in judging whether studies at various positions on alternative research hierarchies (depending on the EBP question being asked) merit being used with caution in guiding their practice.
• Assess treatment progress with chosen interventions in a feasible manner as part of the final stage of EBP.
ORGANIZATION
The first part of this book contains three chapters that provide a backdrop for the rest of the book. Chapter 1 shows why it’s important for readers to learn about research methods from the standpoint of becoming evidence-based practitioners, briefly reviews the history of EBP, defines EBP, discusses the need to develop an EBP outlook and describes what that outlook means, discusses feasibility constraints practitioners face in trying to engage in the EBP process, and offers suggestions for making the various steps in the process more feasible for them.
Chapter 2 describes the steps of the EBP process—including how to formulate an EBP question and how to search for evidence bearing on that question and to do so feasibly. Overviews are provided of subsequent steps—steps that are discussed in more depth in subsequent chapters. As in other chapters, Chapter 2 ends with a focus on feasibility issues.
One of the most controversial and misunderstood aspects of EBP concerns hierarchies for evaluating sources of evidence. Some think that there is only one hierarchy for appraising research and guiding practice. Some believe that unless a study meets all the criteria of the gold standard of randomized clinical trials (RCTs), then it is not worthy of guiding practice. Others are offended by the notion of an EBP research hierarchy and believe it devalues qualitative inquiry and nonexperimental research, such as multivariate correlational studies using cross-sectional, case-control, or longitudinal designs.
Chapter 3 attempts to alleviate this controversy and misunderstanding by discussing the need to conceptualize multiple research hierarchies for different types of EBP questions. It explains how and why certain kinds of designs belong at or near the top of one hierarchy yet at or near the bottom of another hierarchy. Thus, the chapter provides examples of EBP research questions for which qualitative studies deserve to be at the top of a research hierarchy for some questions and near the bottom for others and likewise why RCTs belong near the top or bottom of hierarchies depending on the EBP question being asked.
Part II delves into what practitioners need to know so that they can critically appraise studies pertinent to EBP questions about the effectiveness of interventions, programs, or policies. Chapter 4 sets the stage for the remaining four chapters in this section by discussing criteria for inferring effectiveness, including such concepts as internal and external validity, measurement issues, and statistical chance.
Chapter 5 describes the nature and logic of experiments and how to critically appraise them. It does not address the conducting of experiments. Instead, it emphasizes what features to look for in appraising an experiment that might represent minor or fatal flaws despite random assignment. Those features include measurement biases and attrition biases that can lead to erroneous conclusions that an intervention is effective as well as things like diffusion and resentful demoralization that can lead to erroneous conclusions that an intervention is ineffective.
Chapters 6 and 7 describe the nature and logic of quasi-experiments and how to critically appraise them. These chapters do not delve into how to implement them. Instead, they emphasize what features to look for in appraising a quasi-experiment that might represent minor or fatal flaws or that might be important strengths to help offset the lack of random assignment.
Chapter 6 focuses on critically appraising nonequivalent comparison groups designs. It distinguishes between those designs and pre-experimental pilot studies and discusses how the two sometimes are mistakenly equated. It discusses the potential value of pilot studies to practitioners when more conclusive sources of evidence that apply to their EBP question are not available. It also alerts practitioners to the ways in which authors of pre-experimental studies can mislead readers by discussing their findings as if they offer stronger grounds than is warranted for calling the intervention, program, or policy they studied evidence-based. Practitioner-friendly statistical concepts are discussed at a conceptual level, providing readers what they’ll need to know to understand the practical implications of—and not get overwhelmed by—multivariate procedures used to control for possible selectivity biases. Chapter 7 extends the discussion of quasi-experiments by focusing on the critical appraisal of time-series designs and single-case designs.
Chapter 8 discusses how to critically appraise systematic reviews and meta-analyses. It includes content on the advantages of both as well as risks in relying exclusively on them. It also addresses how to find them, key things to look for when critically appraising them, and what distinguishes them from other types of reviews. The meta-analytical statistical concept of effect size is discussed in a practitioner-friendly manner.
Part III turns to the critical appraisal of studies for EBP questions that do not emphasize causality and internal validity. Chapter 9 discusses critically appraising nonexperimental quantitative studies, such as surveys, longitudinal studies, and case-control studies. Chapter 10 then discusses critically appraising qualitative studies. Qualitative studies play an important role in EBP when practitioners seek to gain a deeper understanding of the experiences of people whom they want to help and what those experiences mean to those people. Thus, Chapter 10 includes content on what to look for when critically appraising qualitative observation, qualitative interviewing, qualitative sampling, and grounded theory. Different frameworks for appraising qualitative studies are discussed from the standpoints of empowerment standards, social constructivist standards, and contemporary positivist standards.
The final section of this book, Part IV, contains two chapters that address EBP questions pertaining to assessing clients and monitoring their progress. Chapter 11 discusses how to critically appraise and select assessment instruments. It covers in greater depth and in a practitioner-friendly manner the following concepts that also are addressed (in less depth) in earlier chapters: reliability, validity, sensitivity, and cultural sensitivity. It also shows how to locate assessment instruments and—as with other chapters—emphasizes practitioner and client feasibility.
Chapter 12 turns to feasible ways practitioners can implement aspects of single-case design techniques to monitor client progress as part of the final stage of the EBP process. This chapter is distinguished from the way other sources cover this topic by its emphasis on feasibility. Chapter 12 is based on the premise that when practitioners are providing interventions that already have the best evidence, they don’t need to pursue elaborate designs that are likely to intimidate them and be unfeasible for them in light of their everyday practice realities. Instead of feeling that they must implement designs that have a high degree of internal validity in isolating the intervention as the cause of the client’s improved outcome, they can just monitor progress to check on whether their particular client is achieving a successful outcome or is perhaps among those people who don’t benefit from the intervention. This chapter is distinguished from Chapter 7 in that Chapter 7 focuses on appraising published single-case design studies from the standpoint of finding interventions supported by the best evidence. In keeping with its feasibility emphasis, Chapter 12 proposes the B plus (B+) design. It also illustrates some feasible ways in which practitioners can devise their own measures to monitor client progress.
SPECIAL FEATURES
Chapters 4 through 11 end by presenting two synopses of (mainly fictitious) research studies germane to each chapter’s purpose. Readers can critically appraise each of these 16 synopses—writing down strengths, reasonable flaws, and fatal flaws and indicating whether and how each could be used to guide decisions about evidence-based practice. Eight appendixes (A through H) at the end of the book provide my brief appraisals of each synopsis to which readers can compare their appraisals. Each of those eight appendixes corresponds to the two synopses in a particular chapter. Appendix A, for example, presents my appraisals of the synopses at the end of Chapter 4, Appendix B corresponds to Chapter 5, and so on.
In addition to the synopses, each chapter also ends with a list of key chapter concepts, some review exercises, and some additional readings pertinent to the chapter contents. Terms that appear in bold in the text are defined in a glossary at the end of the book.
I hope you find this book useful. Any suggestions you have for improving it will be appreciated and can be sent to me at
[email protected].
Acknowledgments
Thanks go to the following colleagues who reviewed this book: Kevin Corcoran, PhD, JD, of Portland State University; Jeffrey M. Jenson, PhD, of the University of Denver; Edward J. Mullen, DSW, of Columbia University; Aron Shlonsky, PhD, of the University of Toronto; and Haluk Soydan, PhD, of the University of Southern California. Thanks also go to Danielle Parrish, who, as a doctoral student at the University of Texas at Austin, reviewed chapters, made many extremely helpful suggestions, and provided technical assistance in the preparation of some figures. I also appreciate the support of the following people at Wiley: Lisa Gebo (senior editor), Peggy Alexander (vice president, publisher), and Sweta Gupta (editorial assistant).
PART I
OVERVIEW OF EVIDENCE-BASED PRACTICE
Chapter 1
INTRODUCTION TO EVIDENCE-BASED PRACTICE
You’ve started reading a book about research so you must have some free time. But aren’t there other things you could do right now that are less onerous than reading about research? You could dust your office. You could make that overdue visit to your dentist. Or maybe listen to a Barry Manilow CD. Okay, okay, not Barry Manilow! But read about research? What compelled you to do that?
Actually, that’s a rhetorical question because I think I know the answer, and I’m just trying to connect with you. Start where the reader (i.e., the client) is at, as it were—sort of like building a therapeutic alliance. My hunch is that you’re reading this book because there is significant pressure these days on practitioners to engage in evidence-based practice (EBP), which implies (in part) using research findings to guide their practice decisions. If you are like most of the practitioners I know, you probably resent that pressure. But it’s a reality you must deal with, and perhaps by reading this book you’ll be better prepared to deal with it on your terms. That is, by learning more about how to utilize and appraise EBP research, you’ll be better equipped to understand, question, or negotiate with others—like managed care companies—who cite EBP as the reason they think they know better than you do what you should do in your practice.
Although the term evidence-based practice has become fashionable only recently, the main ideas behind it are really quite old. As early as 1917, for example, in her classic text on social casework, Mary Richmond discussed the use of research-generated facts to guide the provision of direct clinical services as well as social reform efforts.
Also quite old is the skepticism implicit in EBP about the notion that your practice experience and expertise—that is, your practice wisdom—are a sufficient foundation for effective practice. That skepticism does not imply that your practice experience and expertise are irrelevant and unnecessary—just that they alone are not enough.
Perhaps you don’t share that skepticism. In fact, it’s understandable if you even resent it. Many decades ago, when I first began learning about clinical practice, I was taught that to be an effective practitioner I had to believe in my own effectiveness as well as the effectiveness of the interventions I employed. Chances are that you have learned this, too, either in your training or through your own practice experience. It stands to reason that clients will react differently depending on whether they are being served by practitioners who are skeptical about the effectiveness of the interventions they provide versus practitioners who believe in the effectiveness of the interventions and are enthusiastic about them.
Continue reading in the full edition!