The Cycle of Excellence (E-Book)

Description

How do the good become great? Practice! From musicians and executives to physicians and drivers, aspiring professionals rely on deliberate practice to attain expertise. Recently, researchers have explored how psychotherapists can use the same processes to enhance the effectiveness of psychotherapy supervision for career-long professional development. Based on this empirical research, this edited volume brings together leading supervisors and researchers to explore a model for supervision based on behavioral rehearsal with continuous corrective feedback. Demonstrating how this model complements and enhances a traditional, theory-based approach, the authors explore practical methods that readers can use to improve the effectiveness of their own psychotherapy training and supervision.

This book is the 2018 Winner of the American Psychological Association Supervision & Training Section's Outstanding Publication of the Year Award.




Table of Contents

Cover

Title Page

Copyright

Dedication

About the Editors

List of Contributors

Part I: The Cycle of Excellence

1 Introduction

The Overall Effectiveness of Psychotherapy

Opportunity for Improvement

Current Strategies for Improving Effectiveness

The Science of Expertise

The Cycle of Excellence

How Much Practice Is Enough?

Bringing the Science of Expertise to Psychotherapy

Sources of Motivation to Engage in Deliberate Practice

About This Book

Questions from John Norcross, PhD

References

2 Professional Development: From Oxymoron to Reality

Learning from Experts

Application of Deliberate Practice in Psychotherapy

Making Professional Development a Reality

Questions from the Editors

References

3 What Should We Practice?: A Contextual Model for How Psychotherapy Works

Contextual Model

Characteristics and Actions of Effective Therapists

Conclusions

Questions from the Editors

References

4 Helping Therapists to Each Day Become a Little Better than They Were the Day Before: The Expertise‐Development Model of Supervision and Consultation

Concepts and Context for the Expertise‐Development Model

Expertise‐Development Model of Supervision and Consultation

Additional Issues Related to Implementing the Expertise‐Development Model

Conclusion

Questions from the Editors

References

Part II: Tracking Performance

5 Qualitative Methods for Routine Outcome Measurement

What Is Qualitative Methodology?

Rationale for Qualitative Outcome Assessment

Using Qualitative Feedback and Outcome Data Collection Methods in Routine Clinical Practice

Conclusion: Issues and Challenges in Using Qualitative Outcome and Process Instruments to Inform Deliberate Practice

Questions from the Editors

References

6 Quantitative Performance Systems: Feedback‐Informed Treatment

Overview of Select ROM Systems

Using Client Feedback for Clinical Improvement

Conclusion

Questions from the Editors

References

7 Routine Outcome Monitoring in Child and Adolescent Mental Health in the United Kingdom at the Individual and Systems Levels: Learning from the Child Outcomes Research Consortium

Making Routine Outcome Monitoring a Central Part of Direct Work with Clients

Making Routine Outcome Monitoring a Central Part of Practitioner Development

Making Routine Outcome Monitoring a Central Part of System‐Level Quality Improvement

Conclusion

Questions from the Editors

References

Part III: Applications for Integrating Deliberate Practice into Supervision

8 Some Effective Strategies for the Supervision of Psychodynamic Psychotherapy

Techniques/Processes Associated with Outcome in Psychodynamic Psychotherapy

Techniques/Processes Associated with Therapeutic Alliance in Psychodynamic Psychotherapy

Empirical Research on Graduate Trainees Learning Psychodynamic Psychotherapy

Methods for Training Therapists in Psychodynamic Psychotherapy

Conclusion

Questions from the Editors

References

9 Nurturing Therapeutic Mastery in Cognitive Behavioral Therapy and Beyond: An Interview with Donald Meichenbaum

Interview

References

10 Nurturing Expertise at Mental Health Agencies

Calgary Counselling Centre

Changes in Outcomes at CCC over Time

Costs to Implement

Summary and Conclusion: A Culture of Excellence

Questions from the Editors

References

11 The Ongoing Evolution of Continuing Education: Past, Present, and Future

Laying the Groundwork

Continuing Professional Development: Early Groundwork

Contemporary Landscape of Continuing Professional Development

Continuing Education Worldwide

Redesigning Continuing Professional Development

Illustration of Effective Continuing Professional Development

Concluding Remarks

Questions from the Editors

References

12 Advances in Medical Education from Mastery Learning and Deliberate Practice

History of U.S. Medical Education

Mastery Learning with Deliberate Practice

Translational Outcomes from Medical Education

Coda

Questions from the Editors

References

Part IV: Recommendations

13 Improving Psychotherapy Outcomes: Guidelines for Making Psychotherapist Expertise Development Routine and Expected

Training Programs

Regulatory Boards and Agencies: Overseeing Initial and Ongoing Rights to Practice

Supervisors and Consultants

Administrators of Clinics, Agencies, and Mental Health Systems

Professional Associations

Research

Practicing Psychotherapists

References

Index

End User License Agreement

List of Tables

Chapter 1

Table 1.1 Comparison of routine performance, passive learning, and deliberate practice.

Table 1.2 Deliberate practice goals, settings, areas, and methods across the career span.

Chapter 3

Table 3.1 Effective therapists.

Chapter 4

Table 4.1 Traditional supervision or consultation versus simulation‐based behavioral rehearsal.

Table 4.2 Examples of tasks assigned to trainees or therapists to achieve particular training goals.

List of Illustrations

Chapter 1

Figure 1.1 Cycle of Excellence.

Figure 1.2 Comparing the relationship between the hours of deliberate practice and improved performance for therapists and violinists.

Figure 1.3 Improved performance via deliberate practice.

Chapter 2

Figure 2.1 Four primary components of deliberate practice framework.

Figure 2.2 Three components of domain‐specific knowledge.

Figure 2.3 Cycle of overcoming automaticity.

Figure 2.4 Therapists grouped in quartiles based on their adjusted client outcomes as a function of estimated time spent on “deliberate practice alone” per typical workweek.

Figure 2.5 Normed progress trajectory for PCOMS.

Figure 2.6 Differentiation between performance and learning.

Figure 2.7 Mean scores based on subscales of the Facilitative Interpersonal Skills ratings across the five trials in difficult conversations in therapy.

Chapter 3

Figure 3.1 Contextual Model.

Chapter 4

Figure 4.1 Conceptual map of training goals and pathways.

Figure 4.2 The supervisor's or consultant's responsibilities in facilitating the Cycle of Excellence.

Chapter 6

Figure 6.1 Graphical representation of client outcome data.

Chapter 7

Figure 7.1 Plan Do Study Act logbook.

Figure 7.2 How the report compares services across the learning collaboration using funnel plots, which prevent overinterpretation of fluctuations in outcomes due to small data sets.

Figure 7.3 How to interpret a funnel plot: a worked example.

Figure 7.4 Change scores for the service for one particular outcome measure compared with those from the rest of CORC.

Figure 7.5 Service experience data.

Chapter 9

Figure 9.1 Model of Mastery.

Chapter 10

Figure 10.1 Clinical dashboard.

Chapter 12

Figure 12.1 Mastery learning with deliberate practice in medical education.

Figure 12.2 Mastery learning of lumbar puncture skills.

Figure 12.3 Simulation‐based medical education as translational science.


The Cycle of Excellence

Using Deliberate Practice to Improve Supervision and Training

 

Edited By

Tony Rousmaniere

Rodney K. Goodyear

Scott D. Miller

Bruce E. Wampold

 

 

 

 

 

This edition first published 2017. © 2017 John Wiley & Sons, Ltd.

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.

The right of Tony Rousmaniere, Rodney K. Goodyear, Scott D. Miller, and Bruce E. Wampold to be identified as the authors of the editorial material in this work has been asserted in accordance with law.

Registered Offices
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA
John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK

Editorial Office
The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK

For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.

Wiley also publishes its books in a variety of electronic formats and by print-on-demand. Some content that appears in standard print versions of this book may not be available in other formats.

Limit of Liability/Disclaimer of Warranty

While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. It is sold on the understanding that the publisher is not engaged in rendering professional services and neither the publisher nor the authors shall be liable for damages arising herefrom. If professional advice or other expert assistance is required, the services of a competent professional should be sought.

Library of Congress Cataloging-in-Publication Data

Names: Rousmaniere, Tony, editor. | Goodyear, Rodney K., editor. | Miller, Scott D., editor. | Wampold, Bruce E., 1948- editor.
Title: The Cycle of Excellence : Using Deliberate Practice to Improve Supervision and Training / [edited by] Tony Rousmaniere, Rodney K. Goodyear, Scott D. Miller, Bruce E. Wampold.
Description: Chichester, UK ; Hoboken, NJ : John Wiley & Sons, 2017. | Includes index.
Identifiers: LCCN 2017015214 (print) | LCCN 2016055347 (ebook) | ISBN 9781119165569 (Paperback) | ISBN 9781119165583 (Adobe PDF) | ISBN 9781119165576 (ePub)
Subjects: LCSH: Career development. | Supervision of employees.
Classification: LCC HF5549.5.C35 S86 2017 (ebook) | LCC HF5549.5.C35 (print) | DDC 616.89/14023—dc23
LC record available at https://lccn.loc.gov/2017015214

Cover Design: Wiley
Cover Image: © eugenesergeev/Gettyimages

Dedicated to therapists who strive to improve their results

About the Editors

Tony Rousmaniere, PsyD, is a member of the clinical faculty at the University of Washington in Seattle, where he also maintains a private practice. He is the author of Deliberate Practice for Psychotherapists: A Guide to Improving Clinical Effectiveness and coeditor of Using Technology for Clinical Supervision: A Practical Handbook (American Counseling Association Press, 2015). Dr. Rousmaniere provides clinical training and supervision to therapists around the world, with an emphasis on using deliberate practice to improve the effectiveness of clinical skill development.

Rodney K. Goodyear, PhD, received his doctorate at the University of Illinois at Urbana‐Champaign. He is a professor at the University of Redlands in Redlands, California as well as emeritus professor of counseling psychology at the University of Southern California, and was the 2015 president of the Society for the Advancement of Psychotherapy. A major theme of his scholarship has been supervision and training of counselors and psychologists. Dr. Goodyear's book with Janine Bernard—Fundamentals of Clinical Supervision (Pearson, 2014)—is in its fifth edition and is arguably the most‐used supervision book in the world; he was a member of the American Psychological Association's task group that developed the APA's supervision guidelines; and he received the APA's 2015 award for Distinguished Lifetime Contributions to Education and Training.

Scott D. Miller, PhD, is the founder of the International Center for Clinical Excellence, an international consortium of clinicians, researchers, and educators dedicated to promoting excellence in behavioral health services. Dr. Miller conducts workshops and training in the United States and elsewhere, helping hundreds of agencies and organizations, both public and private, to achieve superior results. He also is one of a handful of invited faculty whose work, thinking, and research are featured at the prestigious Evolution of Psychotherapy Conference. His humorous and engaging presentation style and command of the research literature consistently inspire practitioners, administrators, and policy makers to make effective changes in service delivery.

Bruce E. Wampold, PhD, is professor emeritus of counseling psychology at the University of Wisconsin–Madison, director of the Research Institute at Modum Bad Psychiatric Center in Vikersund, Norway, and chief scientist of Theravue, an electronic platform for therapist consultation and improvement. He is a fellow of the American Psychological Association (Divisions 12, 17, 29, 45) and is board certified in counseling psychology by the American Board of Professional Psychology. He is the author of over 200 books, chapters, and articles related to counseling, psychotherapy, statistics, and research methods and is the recipient of the 2007 Distinguished Professional Contributions to Applied Research Award from the American Psychological Association and the Distinguished Research Career Award from the Society for Psychotherapy Research.

Currently Dr. Wampold's work involves understanding counseling and psychotherapy from empirical, historical, and anthropological perspectives. His pursuit of evidence on psychotherapy has led to the application and development of sophisticated statistical methods to understand the complexities of the field. He has contributed to various areas related to psychotherapy, including the relative efficacy of various approaches, therapist effects, the therapeutic alliance, placebo effects in medicine and in psychotherapy, trajectories of change, multicultural competence, and expertise in psychotherapy. His analysis of empirical evidence, which led to the development of a contextual model from which to understand the benefits of counseling and psychotherapy, is found in The Great Psychotherapy Debate: The Evidence for How Psychotherapy Works (with Z. Imel, Routledge, 2015).

List of Contributors

Robbie Babins‐Wagner, PhD, RSW, is the chief executive officer of Calgary Counselling Centre and an adjunct professor and sessional instructor with the Faculty of Social Work, University of Calgary, Calgary, Alberta, Canada. Her research interests focus on domestic abuse and psychotherapy outcomes in community‐based, nonprofit mental health services. Robbie is a sought‐after conference presenter, locally, provincially, nationally, and internationally.

Nicholas Bach, MA, is a clinical psychology student at Spalding University in Louisville, Kentucky. He has worked clinically in a private practice, public schools, a residential treatment facility, and a college counseling center. His research focuses on psychotherapy outcome, romantic relationships, religion and spirituality, and military active‐duty personnel and veterans.

Matt Barnard, MA Cantab, is the head of the Child Outcomes Research Consortium (CORC). Before joining CORC, Matt was head of evaluation at the NSPCC, where he led one of the largest‐ever programs of evaluation and learning in the children's sector.

Stephanie Winkeljohn Black, PhD, is an assistant professor of psychology in the Department of Psychology and Social Sciences at Penn State Harrisburg, Pennsylvania. Her area of research focuses on religious and spiritual behaviors and mental health across diverse groups and on trainees' cultural competency as it relates to religious and spiritual identities.

Jenny Bloxham, MA, is the communications and influencing manager at the Child Outcomes Research Consortium in London. She has a wealth of experience working for children's education and health charities in both the United Kingdom and elsewhere, including Save the Children, the International Catholic Migration Commission, and the UN Refugee Agency. Jenny holds an undergraduate degree in modern European studies and a master's in communications, new media, governance, and democracy.

Norah A. Chapman, PhD, is an assistant professor at Spalding University in Louisville, Kentucky. Her primary research interests are in evaluating components of psychotherapy process and outcome, both in person and via telepsychology, to develop evidence‐based practices that increase access to and the quality of mental health care among underserved populations.

Daryl Chow, PhD, is a senior associate and certified trainer with the International Center for Clinical Excellence, where he conducts research on deliberate practice and professional development for psychotherapists. He is currently based in Western Australia, working with a group of vibrant private practitioners (Specialist Psychological Outreach Team [SPOT]) located in Fremantle, WA. He is a coeditor of and contributing author to the book The Write to Recovery: Personal Stories & Lessons About Recovery from Mental Health Concerns and is coauthor of Reach: Pushing Your Clinical Performance to the Next Level with Scott Miller, PhD (forthcoming).

Kate Dalzell, MA, is practice lead at the Child Outcomes Research Consortium (CORC) and head of innovation and dissemination at the Anna Freud National Centre for Children and Families, both located in London. Kate has worked in service development in local authority and health contexts for over 10 years, in particular in applying data‐driven approaches to embed a focus on outcomes and in supporting cross‐sector collaboration to address local needs.

Marc J. Diener, PhD, is an associate professor in the clinical psychology doctoral program at Long Island University Post, and he maintains a part‐time independent practice. His program of research examines personality assessment as well as psychotherapy process and outcome. His publications have focused on attachment, psychotherapy technique, psychotherapy outcome, supervision, application of meta‐analytic methodology, and self‐report and performance‐based measures of personality.

Joanna M. Drinane, MEd, is a doctoral candidate in counseling psychology at the University of Denver, Colorado. Her areas of interest include psychotherapy process and outcome research. More specifically, she studies therapist effects, multicultural orientation, mental health disparities, and the ways in which culture influences the therapeutic relationship.

Simon B. Goldberg, BA, is a doctoral candidate in counseling psychology at the University of Wisconsin–Madison and a psychology intern at the Veterans Affairs Puget Sound, Seattle Division. His research program is focused on common and specific factors at play in psychological interventions. He has a particular emphasis on mindfulness‐based interventions and quantitative research methods.

Mark J. Hilsenroth, PhD, is a professor of psychology at the Derner Institute of Advanced Psychological Studies at Adelphi University in Garden City, New York, and the primary investigator of the Adelphi University Psychotherapy Project. His areas of professional interest include personality assessment, training/supervision, psychotherapy process and treatment outcomes. In addition, he is currently editor of the American Psychological Association Division 29 journal Psychotherapy, and he maintains a part‐time clinical practice.

Mark A. Hubble, PhD, grew up near Baltimore, Maryland, bodysurfing the cold waters of the Atlantic. Currently he works as a psychologist and national consultant. An accomplished writer and editor, Mark has published numerous articles and is coauthor of The Heart and Soul of Change, Escape from Babel, Psychotherapy with “Impossible” Cases, and The Handbook of Solution‐Focused Brief Therapy.

Jenna Jacob, MSc, is the research lead for the Child Outcomes Research Consortium (CORC). Her particular research interests are in personalized care and outcomes for children and families, which includes goal setting and tracking as part of shared decision making.

Emma Karwatzki, D.Clin.Psy., is a clinical psychologist working in Hertfordshire, UK. She has worked as a clinician and supervisor in child mental health services for over 10 years and trains clinical psychologists.

Duncan Law, D.Clin.Psy., is a consultant clinical psychologist at the Anna Freud National Centre for Children and Families in London and director of MindMonkey Associates (www.mindmonkeyassociates.com). In addition, he is an honorary senior lecturer at University College London and a founder member of the Child Outcomes Research Consortium (CORC) in London.

Kate Martin, MA, is founder and director of Common Room Consulting Ltd, a consultancy led by lived experience, which connects the views and expertise of children, young people, researchers, and practitioners to promote collaborative practice across disability, health, and mental health.

William C. McGaghie, PhD, is professor of medical education and professor of preventive medicine at the Northwestern University Feinberg School of Medicine in Chicago. His area of research interest focuses on the use of medical simulation coupled with deliberate practice and mastery learning to produce translational medical education outcomes.

John McLeod, PhD, holds positions at the University of Oslo, Norway, and the Institute for Integrative Counselling and Psychotherapy, Dublin. He has extensive experience as a counselor, supervisor, trainer, and researcher. His many publications include these books: Personal and Professional Development for Counsellors, Psychotherapists and Mental Health Practitioners, published by Open University Press, and Using Research in Counselling and Psychotherapy, published by Sage.

Donald Meichenbaum, PhD, is distinguished professor emeritus, University of Waterloo, Ontario, Canada, and is currently research director of the Melissa Institute for Violence Prevention in Miami, FL (www.melissainstitute.com). He is one of the founders of cognitive behavioral therapy, and he specializes in trauma and resilience. (Please see www.roadmaptoresilience.org.)

Greg J. Neimeyer, PhD, is professor emeritus at the University of Florida, Gainesville, where he has served as a faculty member, director of training, and graduate coordinator. With over 200 publications in the areas of counseling and professional development, he has been recognized by the American Psychological Association with its Award for Outstanding Research in Career and Personality Psychology.

Jesse J. Owen, PhD, is an associate professor and chair of the Counseling Psychology Department at the University of Denver in Colorado. He is also a licensed psychologist and has a private practice in Denver. His research and practice interests include psychotherapy process and outcome, with a specific emphasis on multicultural processes and therapist expertise.

Benjamin Ritchie, MSc, is the lead of Child Outcomes Research Consortium (CORC) Informatics, which supports CORC's member services and central team in processing and managing large data sets. He has particular experience in the fields of data handling and information governance. His current work with partnerships of organizations in the health, education, and social care sectors aims to link data sources in order to allow service‐user outcomes to be considered from different perspectives.

Jennifer M. Taylor, PhD, is an assistant professor of counseling psychology and counseling at the University of Utah in Salt Lake City. Her research interests include professional competence, continuing education, lifelong learning, continuing professional development, and mentoring. She serves as the chair of the Continuing Education Committee for the American Psychological Association and is the coeditor of Continuing Professional Development and Lifelong Learning: Issues, Impacts, and Outcomes (Nova Science, 2012).

Isabelle Whelan, MA, is a research editor with 10 years' experience working in research communication and international development.

Miranda Wolpert, D.Clin.Psy., is founder and director of the Child Outcomes Research Consortium (CORC), the UK's leading membership organization that collects and uses evidence to improve the mental health and well‐being of children and young people, and professor of evidence‐based practice and research at University College London. She is committed to understanding how best to support and evaluate effective service delivery to promote resilience and meet children's and young people's mental health needs.

Part IThe Cycle of Excellence

1Introduction

Tony Rousmaniere, Rodney K. Goodyear, Scott D. Miller, and Bruce E. Wampold

An ounce of practice is worth more than tons of preaching.

—Mahatma Gandhi

Over the past century, dramatic improvements in performance have been experienced in sports, medicine, science, and the arts. This is true, for example, in every Olympic sport (e.g., Lippi, Banfi, Favaloro, Rittweger, & Maffulli, 2008). College athletes in running, swimming, and diving perform better than gold medal winners from the early Olympic Games (Ericsson, 2006). In medicine, the number of diseases that can be treated effectively has steadily increased, while mortality from medical complications has decreased (Centers for Disease Control, 2012; Friedman & Forst, 2007). In mathematics, calculus that previously required decades to learn is now taught in a year of high school (Ericsson, 2006). In the arts, modern professional musicians routinely achieve or exceed technical skill that previously was attainable only by unique masters like Mozart (Lehmann & Ericsson, 1998).

Unfortunately, the same cannot be said of mental health treatment. Although the number and variety of psychotherapy models have grown rapidly, the actual effectiveness of psychotherapy has not experienced the dramatic improvements seen in the fields described (Miller, Hubble, Chow, & Seidel, 2013). For example, in modern clinical trials, cognitive behavioral therapy appears to be less effective than was demonstrated in the original trials from the 1970s (Johnsen & Friborg, 2015). That we have remained on this performance plateau is clearly not due to a lack of desire for improvement—virtually all mental health clinicians want to be more effective. So what have we been missing? How can we get better at helping our clients? In this book, we outline procedures that lead to increasing the effectiveness of psychotherapy.

The Overall Effectiveness of Psychotherapy

First, let's step back to examine the big picture concerning the effectiveness of psychotherapists. Good news: The consistent finding across decades of research is that, as a field, we successfully help our clients. Studies examining the effectiveness of clinicians working across the field, from community mental health centers, to university counseling centers, to independent practice, show that, on average, mental health clinicians produce significant positive change for their clients (Lambert, 2013; Wampold & Imel, 2015). The average psychologically distressed person who receives psychotherapy will be better off than 80% of the distressed people who do not (Hubble, Duncan, & Miller, 1999; Wampold & Imel, 2015). Dozens of studies show that the effects of psychotherapy and counseling are at least as large as the effects of psychotropic medications and that psychotherapy and counseling are less expensive, have fewer troubling side effects, and last longer (Forand, DeRubeis, & Amsterdam, 2013; Gotzsche, Young, & Crace, 2015).
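To make the "better off than 80%" figure concrete, here is a brief, hedged illustration: if treated and untreated outcomes are roughly normally distributed and the overall treatment effect is expressed as a standardized mean difference near d = 0.8 (a value commonly reported in psychotherapy meta-analyses), then Cohen's U3, the proportion of untreated people whom the average treated person outperforms, is about 0.79. The specific effect size and the computation below are illustrative assumptions rather than figures taken from the studies cited above.

```python
# Illustrative only: relate an assumed standardized effect size (d ~ 0.8)
# to the "better off than ~80%" statement via Cohen's U3 under a normal model.
from scipy.stats import norm

d = 0.8                   # assumed treatment vs. no-treatment effect size
u3 = norm.cdf(d)          # proportion of untreated people below the treated mean
print(f"U3 for d = {d}: {u3:.2f}")  # about 0.79, i.e., roughly 80%
```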

Opportunity for Improvement

Although the big picture is positive, there is room for improvement. For example, in clinical trials, only 60% of clients achieve clinical “recovery,” and between 5% and 10% actually deteriorate during treatment (Lambert, 2013). The percentage of clients who terminate care prematurely falls between 20% and 60%, depending on how “prematurely” is defined (Swift, Greenberg, Whipple, & Kominiak, 2012), and these rates have remained largely unchanged for the past five decades.

Furthermore, there is considerable between‐clinician variability in effectiveness. Whereas the most effective therapists average 50% better client outcomes and 50% fewer dropouts than therapists in general (Miller et al., 2013), these “super shrinks” (Miller, Hubble, & Duncan, 2007) are counterbalanced by those therapists who produce, on average, no change or may even cause most of their clients to deteriorate (Baldwin & Imel, 2013; Kraus, Castonguay, Boswell, Nordberg, & Hayes, 2011; Wampold & Brown, 2005). So there is clear room for many therapists to demonstrably increase their effectiveness.

How, then, can clinicians become more effective? Some may assume that the best way to get better at something is simply to do it a lot. A significant body of research documents that musicians, chess players, and athletes, in the correct circumstances, improve with time and experience (at least up to the point of competency; Ericsson & Pool, 2016). However, psychotherapy is a field in which practitioners' proficiency does not automatically increase with experience (Tracey, Wampold, Goodyear, & Lichtenberg, 2015; Tracey, Wampold, Lichtenberg, & Goodyear, 2014). Two large studies have shown that "time in the saddle" by itself does not improve therapist effectiveness (Goldberg, Rousmaniere et al., 2016; Owen, Wampold, Rousmaniere, Kopta, & Miller, 2016). One of these studies, based on the outcomes of 173 therapists over a period of up to 18 years, found considerable variance in the outcomes achieved by the therapists over time. Although some of the therapists were able to continually improve, client outcomes on average tended to decrease slightly as the therapists gained more experience (Goldberg, Rousmaniere et al., 2016). The other study examined the change in outcomes of 114 trainees over an average of 45 months. As in the Goldberg, Rousmaniere et al. (2016) study, there was considerable variance in the outcomes achieved by trainees over time. Although trainees, on average, demonstrated small gains in outcomes over time, this growth was moderated by client severity, and some trainees demonstrated worse outcomes over time, leading the authors to observe that "trainees appear to have various trajectories in their ability to foster positive client outcomes over time, and at times not a positive trajectory" (Owen et al., 2016, p. 21).

Current Strategies for Improving Effectiveness

What accounts for the failure to improve? Answering that question requires first looking at the four most widely used methods for improving therapist effectiveness: supervision, continuing education (CE), the dissemination of evidence‐based treatments, and outcome feedback systems.

Supervision provides trainees with important professional preparation. For example, supervision has been shown to provide basic helping skills, improve trainees' feelings about themselves as therapists and understanding about being a therapist, and enhance trainees' ability to create and maintain stronger therapeutic alliances, the component of therapy most associated with positive outcomes (e.g., Hill et al., 2015; Hilsenroth, Kivlighan, & Slavin‐Mulford, 2015; Wampold & Imel, 2015). However, evidence concerning the impact of supervision—as it has been practiced—on improving client outcomes is mixed at best (Bernard & Goodyear, 2014; Rousmaniere, Swift, Babins‐Wagner, Whipple, & Berzins, 2016). Indeed, prominent supervision scholars (e.g., Beutler & Howard, 2003; Ladany, 2007) have questioned the extent to which supervision improves clinical outcomes. Summarizing the research in this area, Watkins (2011) reported, “[W]e do not seem to be any more able now, as opposed to 30 years ago, to say that supervision leads to better outcomes for clients” (p. 252).

Continuing education (CE) ("further education" in the United Kingdom) is a second method for improving, or at least maintaining, therapist effectiveness. Many jurisdictions require CE to maintain licensure, certification, or registration necessary for practice. CE is commonly delivered via a passive‐learning format, such as lecture or video (perhaps with some discussion). This format may be effective at imparting knowledge about particular topics (laws, ethics, new treatments, etc.), but typically it includes little interactive practice or corrective feedback for participants and thus has questionable impact on actual skill development. Research from CE in medicine has demonstrated that passive‐learning formats have "little or no beneficial effect in changing physician practice" (Bloom, 2005, p. 380). Summarizing concerns about the limits of CE, Neimeyer and Taylor (2010) reported, "A central concern follows from the field's failure to produce reliable evidence that CE translates into discernibly superior psychotherapy or outcomes, which serves as the cornerstone of the warrant underlying CE and its related commitment to the welfare of the consumer" (p. 668).

A third prominent method for improving therapist effectiveness that has gained considerable momentum over the past half century is the dissemination of evidence‐based treatments (EBTs, also called empirically supported treatments or psychological treatments with research support). Using EBTs to improve the quality of mental health care is based on a two‐step process: (a) clinical trials are used to determine which specific therapy models are effective for treating specific psychiatric disorders, and (b) these models are disseminated by training therapists to be competent in the EBTs. Over the years, hundreds of EBTs have been tested in clinical trials for an ever‐increasing range of disorders, and the results of these trials commonly show EBTs to be more effective than no treatment. However, there is a paucity of evidence that becoming competent in EBTs improves the effectiveness of individual therapists in actual practice (Laska, Gurman, & Wampold, 2014). For example, Branson, Shafran, and Myles (2015) found no relationship between cognitive behavioral therapy competence and patient outcome. In fact, large studies frequently show that clinicians in general practice achieve the same outcomes as those deemed competent in clinical trials (Wampold & Imel, 2015). In a meta‐analysis of clinical trials comparing an EBT to a treatment‐as‐usual condition, Wampold et al. (2011) showed that when treatment as usual involved legitimate psychotherapy, the outcomes of treatment as usual and EBT were not statistically different. Notably, clinical trials often show more variability in outcomes among clinicians than between treatments, suggesting that more attention is needed to skill acquisition by individual clinicians (based on their personal clients' outcome data) across all treatment models (Baldwin & Imel, 2013; Miller et al., 2007; Wampold & Imel, 2015). In summary, competence in evidence‐based treatment models does not appear to be itself sufficient for improving the effectiveness of psychotherapy by individual clinicians in actual practice.

A fourth method for improving therapist effectiveness that has been increasingly adopted over the past two decades is feedback systems, also called practice‐based evidence, in which clinicians monitor their clients' progress by examining outcome data session to session. Feedback systems have been shown to improve the quality of psychotherapy, in part by identifying and preventing failing cases (Lambert & Shimokawa, 2011). In fact, two feedback systems—the Partners for Change Outcome Management System (PCOMS, 2013) and OQ‐Analyst—have such a powerful impact on client outcome that they are now considered an "evidence‐based practice" by the Substance Abuse and Mental Health Services Administration. However, feedback systems have not been shown to lead to the development of clinical expertise for individual therapists (Miller et al., 2013; Tracey et al., 2014). That is, although therapists who receive feedback about particular clients can alter the treatment for those particular clients, the benefits of that feedback do not appear to reliably generalize to other cases or to improve therapists' overall clinical skills.
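To give a concrete sense of the logic these systems implement, the sketch below flags a hypothetical "off-track" case by comparing observed session scores against a simple expected-change benchmark. The linear benchmark, parameter values, and alert threshold are invented for illustration; real systems such as OQ-Analyst and PCOMS rely on empirically derived trajectories and validated cut scores.

```python
# Minimal, hypothetical sketch of outcome-feedback logic: flag a client as
# "off track" when an observed session score lags a simple expected-change
# benchmark. Not how OQ-Analyst or PCOMS actually compute alerts.

def expected_score(intake: float, session: int, gain_per_session: float = 1.5) -> float:
    """Assumed linear improvement benchmark (illustrative parameter)."""
    return intake + gain_per_session * session

def is_off_track(intake: float, session: int, observed: float, tolerance: float = 5.0) -> bool:
    """Alert if the observed score trails the benchmark by more than `tolerance` points."""
    return observed < expected_score(intake, session) - tolerance

# Example: intake score of 40 on a well-being scale where higher = better.
print(is_off_track(intake=40, session=6, observed=41))  # True -> raise for review in supervision
```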

Each of these methods for professional improvement has clear value. However, despite the attention that has been given to strengthening supervision and training (American Psychological Association, 2015), CE (Wise et al., 2010), the dissemination of empirically based treatments (McHugh & Barlow, 2010), and routine clinical feedback (Lambert, 2010), overall psychotherapy outcomes have not improved over the past 40 years (Miller et al., 2013). Simply put, our field has lacked a successful model for therapist skill advancement. So, we return to our question: How can clinicians become more effective? To help answer this question, let's look beyond our field and see what we can learn from others.

The Science of Expertise

During the past two decades, a growing body of research has examined the methods professionals use to attain expertise (e.g., Ericsson, 1996, 2009). The science of expertise has been concerned with identifying how professionals across a wide range of fields—from musicians, to chess players, to athletes, to surgeons—move from average to superior performance. The findings confirm results cited earlier regarding the development of expertise in psychotherapy: Simply accumulating work experience does not itself lead to expert performance (Ericsson, 2006). Rather, researchers have identified a universal set of processes that accounts for the development of expertise as well as a step‐by‐step process that can be followed to improve performance within a particular discipline (Ericsson, Charness, Feltovich, & Hoffman, 2006).

The Cycle of Excellence

Informed by findings reported by researchers (Ericsson, 1996, 2009; Ericsson, Charness, Feltovich, & Hoffman, 2006; Ericsson, Krampe, & Tesch‐Romer, 1993) and writers (Colvin, 2008; Coyle, 2009; Shenk, 2010; Syed, 2010) on the subject of expertise, Miller et al. (2007) identified three components critical for superior performance. Working in tandem to create a “cycle of excellence,” these components include:

Determining a baseline level of effectiveness, including strengths and skills that need improvement;

Obtaining systematic, ongoing, formal feedback; and

Engaging in deliberate practice (see Figure 1.1).

A brief description of each step follows.

Figure 1.1 Cycle of Excellence.

In order to improve, it is essential to know how well one fares in a given practice domain, including strengths and skills that need improvement. Top performers, research shows, are constantly comparing what they do to their own “personal best,” the performance of others, and existing standards or baselines (Ericsson, 2006). As reviewed, in the realm of psychotherapy, numerous well‐established outcome measurement systems are available to clinicians for assessing their baseline (Miller et al., 2013). Each of these systems provides therapists with real‐time comparisons of their results with national and international norms (Lambert, 2010; Miller, Duncan, Sorrell, & Brown, 2005). Specific clinical strengths and skills that need improvement can be identified by supervisors, trainers, or peers, depending on the developmental level of the therapist.
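As a minimal sketch of what establishing a baseline can involve, the code below computes an unadjusted pre-post effect size across a therapist's closed cases so that it could, in principle, be compared against published benchmarks. The data, the assumption that lower scores mean fewer symptoms, and the absence of any case-mix or severity adjustment are all simplifications; the measurement systems cited above provide severity-adjusted, normed comparisons.

```python
# Hedged, illustrative sketch: estimate a therapist's baseline effectiveness as
# the average pre-post change across cases, standardized by the spread of
# intake scores. Real ROM systems adjust for client severity and use norms.
from statistics import mean, stdev

def baseline_effect_size(pre_scores, post_scores):
    """Mean pre-post change divided by the SD of intake scores (unadjusted)."""
    changes = [pre - post for pre, post in zip(pre_scores, post_scores)]  # lower = better
    return mean(changes) / stdev(pre_scores)

pre = [62, 71, 58, 66, 75, 69]    # hypothetical intake symptom scores
post = [41, 60, 50, 44, 63, 52]   # hypothetical termination scores
print(round(baseline_effect_size(pre, post), 2))
```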

The second element in the Cycle of Excellence is obtaining formal, ongoing feedback. Feedback comes from two sources: (a) empirical outcome measures and (b) coaches and teachers—in psychotherapy, these often are referred to as supervisors—whose job it is to identify the skills that need to be developed and provide specific suggestions and training experiences specifically designed to enhance the individual's performance. High‐level performers, it turns out, both seek out and have more access to such mentoring from recognized experts (Hunt, 2006). As discussed earlier, research has shown that ongoing feedback from supervisors can improve trainees' clinical skills, such as the ability to build a strong therapeutic working alliance (e.g., Hill et al., 2015; Hilsenroth, Ackerman, Clemence, Strassle, & Handler, 2015).

Although feedback is necessary for improvement, it is not itself sufficient. Creating a Cycle of Excellence requires an additional essential step: engaging in deliberate practice (Ericsson, 2006). Briefly, this type of practice is focused, systematic, and carried out over extended periods of time. Generally, it involves identifying where one's performance falls short, seeking guidance from recognized experts, setting aside time for reflecting on feedback received, and then developing, rehearsing, executing, and evaluating a plan for improvement (Ericsson, 1996, 2006; Ericsson et al., 1993). Deliberate practice involves a tight focus on repetitively practicing specific skills until they become routine. Because it requires sustained concentration and continuous corrective feedback outside the trainee's comfort zone, deliberate practice typically is not enjoyable or immediately rewarding (Coughlan, Williams, McRobert, & Ford, 2013; Ericsson & Pool, 2016). Deliberate practice intentionally causes a manageable level of strain to stimulate growth and adaptation: “[E]lite performers search continuously for optimal training activities, with the most effective duration and intensity, that will appropriately strain the targeted physiological system to induce further adaptation without causing overuse and injury” (Ericsson, 2006, p. 12). For these reasons, deliberate practice is distinctly different from the two activities most common for therapists: routine performance and passive learning, as illustrated in Table 1.1.

Table 1.1 Comparison of routine performance, passive learning, and deliberate practice.

Routine performance
Definition: Simply performing work as usual.
Examples: Providing therapy.
Goal: To earn an income by providing a service.
Characteristics: Often feels enjoyable and immediately rewarding.

Passive learning
Definition: Learning without a practice and feedback component.
Examples: Attending lectures; reading about psychotherapy models.
Goal: To build general knowledge about models, theories, and skills.
Characteristics: May be enjoyable and feel immediately rewarding.

Deliberate practice
Definition: Repetitively practicing specific skills with continuous corrective feedback.
Examples: Reviewing videos of therapy sessions with an expert providing feedback; repeatedly role-playing solutions to mistakes made in videotaped sessions.
Goal: To address knowledge deficits specific to the therapist; works exactly at the therapist's performance threshold; makes specific skills routine and automatic by moving performance into procedural memory.
Characteristics: Feels challenging and hard; not inherently enjoyable or immediately rewarding.

How Much Practice Is Enough?

Elite performers across many different domains, including professional musicians, athletes, and chess players, devote hours to deliberate practice every day, often including weekends (Ericsson, 1996, 2006; Ericsson et al., 1993). Researchers have found that achieving expert performance does not just take a few years of training but rather requires much more effort—thousands of hours of deliberate practice, often requiring 10 to 30 years of sustained effort and focus (Ericsson, 2006). Furthermore, research indicates that continued deliberate practice throughout the career span is required for maintenance of expert performance (Ericsson, 2006).

The concept of the “10,000‐hour” or “10‐year rule” was brought to popular awareness by the book Outliers (Gladwell, 2008), referring to the amount of time necessary to become an expert in a field. (Research actually has found that the number of hours required for mastery varies by field; Ericsson & Pool, 2016.) However, a common misconception is that thousands of hours of routine work experience lead to expert performance. In contrast, researchers have found something much more challenging: Thousands of hours of deliberate practice, on top of hours spent in routine work performance, usually are required for expert performance.

Could the same process apply to mental health professionals? Chow, Miller, Seidel, Kane, and Andrews (2015) recently examined this question by surveying a group of therapists about the amount of time and effort they dedicated to deliberate practice. Their findings are strikingly similar to what expertise researchers discovered about other fields: Highly effective therapists devoted 4.5 times more hours to activities specifically designed to improve their effectiveness than less effective therapists (Chow et al., 2015). Figure 1.2 compares the findings about therapists from Chow et al. (2015) with the findings from a similar study about violinists (Ericsson et al., 1993).

Figure 1.2 Comparing the relationship between the hours of deliberate practice and improved performance for therapists and violinists.

Sources: Chow et al. (2015, p. 342) and Ericsson et al. (1993, p. 379).

Unfortunately, to date, professional training programs have encouraged deliberate practice to a very limited extent, despite the recognition that training should be "sequential, cumulative and graded in complexity" (Commission on Accreditation, 2013, p. 7). Opportunities to engage in deliberate practice become even fewer once clinicians complete their training. For most therapists, a serious focus on skill acquisition ends at the beginning of their career, right after graduate school. As seen in Figure 1.3, the performance of the typical therapist does not improve over the professional career (i.e., after professional training), a result supported by a longitudinal study of therapist outcomes (Goldberg, Rousmaniere et al., 2016). It appears that students in graduate training acquire skills (e.g., Hill et al., 2015) and improve their outcomes over the course of training, although the improvement in outcomes may be quite gradual and not consistent (Owen et al., 2016). It is worth noting that even for domains where expertise is clearly visible (e.g., musicians, athletes, chess players), few achieve a level recognized as expert. Many of us are passably good musicians (we might sing or play guitar at gatherings or religious services), but we are clearly not in an elite group. Those who are elite, regardless of natural talent, have engaged in deliberate practice.

Figure 1.3 Improved performance via deliberate practice.

Bringing the Science of Expertise to Psychotherapy

Our goal for this book is to bring the science of expertise to the field of mental health. We do this by proposing a model for using the Cycle of Excellence throughout therapists' careers, from supervised training to independent practice.

Stage 1: Deliberate Practice in Supervised Training

The first major stage of clinicians' careers is intensive formal training, with the goal of achieving professional competency. Trainees in this stage work under supervision. Supervision, one of the four methods of development discussed earlier, is a relationship in which a more senior clinician monitors and guides a trainee's work in order both to facilitate trainee development and to ensure quality of client care (American Psychological Association, 2015; Bernard & Goodyear, 2014). Supervision provides a strong yet flexible relationship in which a seasoned expert can identify errors and the skills necessary for improvement, on a case‐by‐case basis. Supervisors can provide the essential ingredients for deliberate practice (McMahan, 2014) by:

Explaining and demonstrating models for effective practice (e.g., cognitive behavioral therapy or psychodynamic psychotherapy);

Determining each therapist's zone of proximal development (i.e., their exact threshold of understanding and opportunity for improvement);

Providing corrective feedback and guidance in a style that is congruent with and accessible to the learner;

Offering emotional encouragement to boost the learner's morale and buffer against the emotional challenges inherent in deliberate practice (Duckworth, Kirby, Tsukayama, Berstein, & Ericsson, 2011); and

Teaching trainees how to work appropriately within various professional domains (clinical, legal, administrative, etc.).

During their first few years of graduate school, trainees are not only learning their craft but also being socialized into the culture of their field. Supervision is the perfect opportunity to instill the habits and attitudes necessary for a “culture of expertise” that will help clinicians use deliberate practice throughout their careers.

Stage 2: Deliberate Practice in Independent Practice

After clinicians complete their formal training and become licensed, they move into the second (and final) major stage of their career: independent practice. At this point, they become responsible for their own learning, which generally can be of several types (Lichtenberg & Goodyear, 2012): incidental learning (i.e., spontaneous, unplanned learning that might occur through, e.g., reviewing a manuscript or hearing a radio interview with an expert); CE experiences; and intentional, self‐directed learning. Deliberate practice concerns that third type of learning and has the goals of maintaining competency and gradually developing mastery of the craft. The mechanisms to support deliberate practice are varied in this stage, and include:

advanced training with experts,

skill assessment and case consultation with experts or peers, and

solo study (e.g., watching videotapes of one's own work).

Table 1.2 describes the different goals, settings, areas of focus, and methods of deliberate practice for each career stage.

Table 1.2 Deliberate practice goals, settings, areas, and methods across the career span.

Career Stage 1: Supervised Training
Goals: Achieve professional competency.
Settings: Under supervision.
Areas of Focus: Attain competency in all basic skills.
Methods: Videotape review, clinical role-plays, assigned homework, etc.

Career Stage 2: Independent Practice
Goals: Assess skills, maintain competency, and develop expertise, leading to mastery of the craft.
Settings: In consultation with experts and peers, and in solo study.
Areas of Focus: Develop advanced skills in areas of specialty; address specific deficiencies.
Methods: Videotape review with experts, with peers, and by oneself; advanced training with experts; self-study, etc.

Sources of Motivation to Engage in Deliberate Practice

Students enter training programs in the mental health professions with excitement. They are highly motivated to seek and capitalize on learning opportunities. But as Stoltenberg and McNeill (2010) have discussed, students' motivation fluctuates across time. It is our impression that most clinicians remain intellectually curious throughout their professional lives but, once they attain basic competence, the curiosity is manifest more in diffuse ways than in focused ways. As discussed, deliberate practice is hard work, and learners typically find it both challenging and inherently unpleasant (Duckworth et al., 2011; Ericsson, 2006).

Researchers have identified a subset of very high‐achieving therapists who do engage in deliberate practice (Miller et al., 2007, 2013). They demonstrate grit, which is “perseverance and passion for long‐term goals” (Duckworth, Peterson, Matthews, & Kelly, 2007, p. 1087) and have “the capacity to stay committed to a challenging, far‐off, but ‘sweet’ goal” (Duckworth et al., 2011, p. 174). Indeed, Duckworth et al. (2011) found that level of grit predicted the extent to which spelling bee competitors engaged in deliberate practice and, in turn, how they performed. Given the challenges of sustaining internal motivation to engage in the deliberate practice necessary to develop expertise, both institutional support and effective mechanisms of accountability are essential to encouraging it (Goodyear, 2015). This is especially true for licensed clinicians who can be tempted to “coast” instead of engaging in ongoing deliberate practice. Recent research at a community mental health center in Canada has shown that agency‐wide support for deliberate practice, led by senior management, can improve client outcomes (Goldberg, Babins‐Wagner et al., 2016). In this book, we describe evidence‐based methods that treatment centers can use to support clinicians’ engagement in the Cycle of Excellence.

About This Book

The goals of this book are to provide clinicians and clinical supervisors with (a) the theory of deliberate practice and the Cycle of Excellence, (b) a new model to integrate deliberate practice into clinical training and independent practice, and (c) case examples of how deliberate practice is being used across a range of psychotherapy settings. This book is organized into four parts.

Part I: The Cycle of Excellence reviews the science of clinical outcomes, expertise, and supervision and proposes a new model for integrating deliberate practice into clinical practice at every stage of a career, from supervised training to independent practice.

Part II: Tracking Performance focuses on an essential ingredient of deliberate practice: empirically tracking therapist effectiveness. In the field of mental health, this means measuring client outcome, which has the full richness and complexity of the human experience. The chapters in this part describe accessible methods supervisors and clinicians can use to track client outcomes at the case, therapist, and agency levels, using both quantitative and qualitative methods.

Part III: Applications for Integrating Deliberate Practice into Supervision explores innovative programs for using deliberate practice to enhance psychotherapy training across a broad spectrum of areas, including psychodynamic psychotherapy, cognitive behavioral therapy, agency‐level improvement, and CE. This part also includes a chapter that describes how deliberate practice has been integrated into medical education, presented as a model and learning opportunity for the field of mental health.

Part IV: Recommendations concludes the volume by pulling together the previous chapters and proposing steps that can be taken to contribute to the mission of improving psychotherapeutic expertise.

Questions from John Norcross, PhD

For each chapter in this volume, we editors have posed several questions to the authors that a critical reader might ask. Answers by the chapter authors appear at the end of each chapter. For those chapters in which one or two of us were authors, others of our team took the role of asking challenging questions.

Because the four of us all were authors of this chapter, we reached outside the team and asked John Norcross, a prominent psychotherapy researcher and trainer, to pose the questions to us. In his characteristic way, he asked questions that were both insightful and rigorous.

Question #1. There is yet but a single research study attesting to the effectiveness of deliberate practice among psychotherapists in routine care. You review the research literature on the value of deliberate practice among other professionals, but those professions are notable for working by themselves and with inanimate objects (e.g., chess pieces, musical instruments), without the reciprocal influence of a client/patient. How do you respond to those who argue that you are recommending a practice (and writing an entire book) well beyond the supportive research evidence with psychotherapists?

Answer from Editors: We wholeheartedly agree with this question's underlying implication that clinical supervision and training methods should be subject to rigorous empirical testing. Indeed, we are arguing for a stance of empirical skepticism toward the effectiveness of all methods of clinical training, old and new. Too many of the field's current supervision practices are in wide use because they have been handed down via tradition rather than having been intentionally adopted on the basis of the research evidence (e.g., Ellis & Ladany, 1997).

In this volume, we are proposing that clinical supervision, training, and CE be reformed along the principles of deliberate practice. This marks a significant departure from the current approaches to clinical supervision and training. For example, we propose (a) to evaluate clinical supervision and training by the impact on client outcomes (rather than adherence and competence in a treatment model); (b) to emphasize active learning methods, such as repetitive behavioral rehearsal of clinical skills via role‐plays with corrective feedback (rather than discussions about psychotherapy theory); and (c) that clinicians receive personal performance feedback continuously throughout their career (rather than stopping when they are licensed).

The question of whether these principles, which have been shown to improve performance across a range of fields, apply equally well to the practice of psychotherapy is a valid one. Psychotherapy is a unique pursuit by virtue of its interpersonal context and demands. When we cite evidence from other fields such as music, athletics, or medicine, our goal is to focus on the learning processes rather than on any implied similarities between psychotherapy and those fields; the point is that the principles by which deliberate practice improves skill acquisition apply across a wide variety of fields and tasks. (For example, Zen Buddhism and other spiritual traditions have relied on deliberate practice for millennia.) Each of these fields is unique, and each has developed its own specific methods of deliberate practice to address its particular pedagogic challenges.

All these fields rely on a human being having learned a particular skill or set of skills. The large body of research that forms the science of expertise identifies principles that improve the effectiveness of human skill acquisition, and we argue this research applies to psychotherapy, including the development of necessary interpersonal skills (e.g., Anderson, McClintock, Himawan, Song, & Patterson, 2015; see Chapter 3).

In short, our primary concern is with new principles of supervision and training. The methods we suggest for implementing these principles are largely drawn from the research evidence (directly or as extrapolations). The next task for our field, though, is for researchers and clinicians to develop new methods of supervision and training, based on these principles, and then subject them to rigorous empirical testing and evaluation in both clinical labs and actual practice.

Question #2. Your “cycle of excellence” bears strong resemblance to other, well‐established models of active learning, such as that by David Kolb. What distinguishes your cycle from those of others, and what specific research support does your model enjoy?

Answer from Editors: This is an excellent question. And because others likely will wonder about it as well, we welcome the chance to address it. At the heart of the Cycle of Excellence model is the assumption that people learn from observing and critiquing their work. This same assumption has informed training since at least the time of Dewey (1938). In fact, other prominent models, such as those of Kolb (e.g., Kolb & Fry, 1975) and Schön (1988), owe a huge intellectual debt to Dewey's observations on the role of experience.

But the Cycle of Excellence differs from these models in at least two fundamental ways. The first concerns the essential role that a coach or supervisor has in providing feedback and direct instruction. This contrasts with discovery learning, which is so often assumed to characterize models such as Kolb's or Schön's. Although discovery learning has an intuitive appeal, Kirschner, Sweller, and Clark (2006) offered a scathing critique of its effectiveness.

The second fundamental difference is in the role that intentional practice is assumed to play in skill development. Those models stress the cognitive processes that lead to new understandings about therapists' work and how therapists might then modify what they do. They do not, though, focus on the hard work of actually rehearsing and consolidating the skills that lead to effective psychotherapy practice.

By contrast, the Cycle of Excellence and deliberate practice emphasize maximizing opportunities for behavioral rehearsal and continuous corrective feedback. The goal is to give trainees ample opportunity to experiment with specific skills, so that they can fail and receive correction many times before trying the skills with real clients. This can be accomplished through the use of role‐plays and other behavioral training drills in supervision. To illustrate, our model aims to be more like how one learns to drive (behavioral rehearsal with continuous corrective feedback), whereas traditional supervision is more like how one learns philosophy (discussing theory). These two models are not mutually exclusive; trainees, of course, need to learn psychotherapy theory. Rather, we are suggesting that behavioral rehearsal with continuous corrective feedback be given a stronger emphasis within the supervision hour than it has received previously.

Question #3. Many researchers have indeed questioned the value‐added benefit of clinical supervision on client outcomes. At the same time, the supervision research and supervision guidelines have converged on a series of best practices, which certainly contain the recommendations for supervision advanced in your chapter. Can you explain how your recommendations (e.g., demonstrate effective practice, provide corrective feedback, offer emotional encouragement) differ from those generic best practices?

Answer from Editors: The supervision “best practices” that various professions have developed (e.g., American Psychological Association, 2014; Association for Counselor Education and Supervision, 2011) represent the wisdom of supervision experts and stand as expected supervision competencies. It is inevitable, then, that we would incorporate them in our model as minimal expectations for effective supervisory practice. To illustrate: All of the available supervision guidelines stress the importance of supervisors directly observing their supervisees' work with clients in contrast to the current dominant practice of relying on supervisees' self‐reports of their work. Directly observing supervisees' work is imperative in our model, which assumes that expertise development is only as effective as the feedback available to guide it. But our model is not limited to that type of feedback and requires, for example, the use of information from routine outcome monitoring. In this way, our model goes beyond expected best practices.

This question also implicitly raises the important distinction between competence and expertise. Competence, as either a supervisor or a psychotherapist, is about performing work in an expected way. When professionals are held accountable for competence, they are responsible for what Lerner and Tetlock (2003) describe as process accountability (see also Goodyear, 2015): To what extent is this person executing a skill set as expected, regardless of the obtained outcomes?

But in our model, expertise development is assessed in terms of client outcomes. Therapists and supervisors are held to outcome accountability (Lerner & Tetlock, 2003): To what extent is this person achieving intended outcomes, regardless of how she or he performed? Both forms of accountability are important, although we give outcome accountability the greater weight: Process should not overshadow outcome. This is especially true because the one does not necessarily predict the other (see, e.g., Webb, DeRubeis, & Barber, 2010).

This book is concerned primarily with ways that psychotherapists can move from competence to expertise. Although we give it less emphasis in the book, we assume as well that supervisors would commit to a similar process of development and that the measure of their evolving expertise would be client outcomes, just as it is with therapists.

Question #4: Figure 2.3 in Chapter 2 shows huge growth in performance among “experts” using deliberate practice. Is this figure based on an actual study, in which the experts achieved twice the performance/client outcomes of “typical therapists”? Or perhaps the figure just represents a conceptual promise?

Answer from Editors: As the growing research on implementation science makes clear, all research offers a “conceptual promise.” There is no guarantee of results. Differences in contexts, clients, management, and providers make any simple transfer of research findings to real‐world clinical settings challenging in the best of circumstances. Moreover, as the study reported in Figure 2.3 makes evident, deliberate practice is hard work; the gains come slowly and are not immediately rewarding. A further threat is that practitioners with average results already consider themselves as effective as the best while devoting significantly less time to efforts aimed at improving their outcomes. At the same time, this same graph, together with findings from other studies cited, shows that the most effective therapists engage in significantly more deliberate practice than their more average counterparts. Indeed, given that clients of therapists in the top quartile achieve outcomes more than twice the size of those of therapists in the bottom quartile (Wampold & Brown, 2005), deliberate practice stands to benefit both clinicians and recipients of mental health services.

In sum, the promise of deliberate practice depicted in Figure 2.3 is indeed real. However, its realization depends on two critical factors: (a) continuous effort and (b) long‐term commitment. Although a number of studies are in the works, knowledge regarding the application of deliberate practice to improving performance as a psychotherapist is still very much in its infancy. Evidence regarding what is required to help practitioners sustain the commitment necessary to realize the gains is, for all intents and purposes, unavailable. Until such evidence accumulates, practitioners and the field will remain reliant on studies from other domains (e.g., sport, music, medicine, computer programming, teaching).

References

Anderson, T., McClintock, A. S., Himawan, L., Song, X., & Patterson, C. L. (2015). A prospective study of therapist facilitative interpersonal skills as a predictor of treatment outcome.