Nursing and Health Interventions covers the conceptual, empirical, and practical knowledge required for engaging in intervention research. This revised edition provides step-by-step guidance through the complex process of intervention development, along with methods for developing, delivering, evaluating, and implementing interventions, supported by a wealth of examples. The text describes each essential aspect of intervention research, from generating an intervention theory to procedures for adopting evidence-based interventions in practice.
This second edition provides up-to-date coverage of intervention research and its impact on improving standards of care. Throughout the text, readers are provided with the foundational knowledge required for generating evidence that informs treatment decisions in practice and for choosing the best approaches for designing, delivering, evaluating, and implementing interventions, making the book a valuable ‘one-stop’ resource for students, researchers, and health professionals alike.
Written by leading experts in the field, Nursing and Health Interventions remains an invaluable resource for nursing and healthcare students, researchers, and health practitioners wanting to understand and apply interventions to improve the quality of care.
Page count: 1035
Year of publication: 2021
Cover
Title Page
Copyright Page
Preface
Acknowledgments
SECTION I: INTRODUCTION
CHAPTER 1: Introduction to Intervention Research
1.1 TREATMENT DECISION‐MAKING
1.2 EVIDENCE‐BASED PRACTICE
1.3 CLIENT‐CENTERED CARE
1.4 COMPLEXITY OF THE REAL WORLD
1.5 CLIENT ENGAGEMENT IN INTERVENTION RESEARCH
1.6 ADVANCES IN INTERVENTION RESEARCH METHODS
1.7 PROCESS FOR DESIGNING, EVALUATING, AND IMPLEMENTING INTERVENTIONS
REFERENCES
CHAPTER 2: Overview of Interventions
2.1 DEFINITION OF INTERVENTIONS
2.2 INTERVENTION ELEMENTS
2.3 CHARACTERISTICS OF INTERVENTIONS
REFERENCES
SECTION II: DEVELOPING INTERVENTIONS
CHAPTER 3: Understanding Health Problems
3.1 IMPORTANCE OF UNDERSTANDING HEALTH PROBLEMS
3.2 THEORY OF THE PROBLEM
3.3 APPROACHES FOR GENERATING THEORY OF THE HEALTH PROBLEM
REFERENCES
CHAPTER 4: Designing Interventions
4.1 PROCESS FOR INTERVENTION DESIGN
4.2 APPROACHES FOR DELINEATING THE INTERVENTION’S ACTIVE INGREDIENTS
4.3 THEORY OF IMPLEMENTATION
4.4 THEORY OF CHANGE
4.5 DESIGNING TAILORED INTERVENTIONS
REFERENCES
CHAPTER 5: Intervention Theory
5.1 INTERVENTION THEORY
5.2 IMPORTANCE OF THE INTERVENTION THEORY
REFERENCES
SECTION III: DELIVERING INTERVENTIONS
CHAPTER 6: Overview of Intervention Delivery
6.1 VARIATIONS IN INTERVENTION DELIVERY
6.2 IMPACT OF VARIATIONS IN INTERVENTION DELIVERY
6.3 INTERVENTION FIDELITY
6.4 STRATEGIES TO ENHANCE FIDELITY
6.5 FIDELITY—ADAPTATION DEBATE
REFERENCES
CHAPTER 7: Development of Intervention Manual
7.1 APPROACH FOR DEVELOPING THE INTERVENTION MANUAL
7.2 CONTENT OF AN INTERVENTION MANUAL
7.3 USE OF THE INTERVENTION MANUAL
REFERENCES
CHAPTER 8: Selecting, Training, and Addressing the Influence of Interventionists
8.1 ROLE OF INTERVENTIONISTS
8.2 INFLUENCE OF INTERVENTIONISTS
8.3 SELECTION OF INTERVENTIONISTS
8.4 TRAINING OF INTERVENTIONISTS
8.5 INVESTIGATING INTERVENTIONIST EFFECTS
REFERENCES
CHAPTER 9: Assessment of Fidelity
9.1 CONCEPTUALIZATION OF FIDELITY
9.2 STRATEGIES AND METHODS FOR ASSESSING THEORETICAL FIDELITY
9.3 STRATEGIES AND METHODS FOR ASSESSING OPERATIONAL FIDELITY
REFERENCES
SECTION IV: EVALUATION OF INTERVENTIONS
CHAPTER 10: Overview of Evaluation of Interventions
10.1 NOTION OF CAUSALITY
10.2 VALIDITY
10.3 PHASES FOR INTERVENTION EVALUATION
REFERENCES
CHAPTER 11: Examination of Interventions’ Acceptance
11.1 FORMULATION OF INTERVENTION ACCEPTANCE
11.2 CONTRIBUTION OF PERCEIVED ACCEPTANCE TO VALIDITY
11.3 EXAMINATION OF ACCEPTABILITY
11.4 EXAMINATION OF PREFERENCES
11.5 EXAMINATION OF CREDIBILITY
11.6 EXAMINATION OF EXPECTANCY
11.7 EXAMINATION OF SATISFACTION WITH TREATMENT
REFERENCES
CHAPTER 12: Examination of Feasibility: Intervention and Research Methods
12.1 TERMS REFLECTING PRELIMINARY STUDIES
12.2 FEASIBILITY OF INTERVENTIONS
12.3 FEASIBILITY OF RESEARCH METHODS
12.4 INTERPRETATION OF OUTCOME FINDINGS
REFERENCES
CHAPTER 13: Process Evaluation
13.1 IMPORTANCE OF PROCESS EVALUATION
13.2 DEFINITION AND ELEMENTS OF PROCESS EVALUATION
13.3 METHODS USED IN PROCESS EVALUATION
13.4 ANALYSIS OF PROCESS DATA
REFERENCES
CHAPTER 14: Outcome Evaluation: Designs
14.1 TRADITIONAL RCT DESIGN
14.2 LIMITATIONS OF THE TRADITIONAL RCT DESIGN
14.3 ALTERNATIVE DESIGNS
14.4 DESIGN SELECTION
REFERENCES
CHAPTER 15: Outcome Evaluation: Methods
15.1 COMPARISON TREATMENT
15.2 SAMPLING
15.3 TREATMENT ALLOCATION
15.4 OUTCOME DATA COLLECTION
15.5 OUTCOME DATA ANALYSIS
REFERENCES
SECTION V: IMPLEMENTING INTERVENTIONS
CHAPTER 16: Frameworks and Methods for Implementing Interventions
16.1 IMPLEMENTATION FRAMEWORKS
16.2 GUIDANCE FOR APPLYING THE IMPLEMENTATION PROCESS
16.3 RESEARCH DESIGNS FOR EVALUATING IMPLEMENTATION INITIATIVES
REFERENCES
Index
End User License Agreement
Chapter 1
TABLE 1.1 Phases of the process for designing, evaluating, and implementing i...
Chapter 2
TABLE 2.1 Formal definitions of interventions.
TABLE 2.2 Examples of programs or multicomponent interventions.
TABLE 2.3 Examples of specific media for providing interventions.
Chapter 3
TABLE 3.1 Summary of the theory of insomnia.
TABLE 3.2 Matrix for analysis of theories.
TABLE 3.3 Narrative review of literature on determinants of insomnia.
Chapter 4
TABLE 4.1 Information to extract from selected reports of empirical studies.
TABLE 4.2 Experiential approach: steps for conducting group meeting with expe...
TABLE 4.3 Combined approach: steps for conducting group meetings.
Chapter 5
TABLE 5.1 Configurations of intervention theory.
Chapter 7
TABLE 7.1 Overview of stimulus control therapy.
TABLE 7.2 Resources needed to deliver the first session of stimulus control t...
TABLE 7.3 Excerpts of manual for delivering sessions 1 and 2 of stimulus cont...
Chapter 9
TABLE 9.1 Recently mentioned frameworks of intervention fidelity.
TABLE 9.2 Conceptual definitions of intervention fidelity.
TABLE 9.3 Matrix for examining theoretical fidelity of one active ingredient ...
Chapter 10
TABLE 10.1 Phases for intervention evaluation.
Chapter 11
TABLE 11.1 Treatment information covered in first intervention session or mod...
TABLE 11.2 Excerpt of rating of acceptability of sleep education and hygiene.
Chapter 13
TABLE 13.1 Approaches/frameworks for process evaluation.
TABLE 13.2 Description of Sleep Education component of the Behavioral Therapy...
Chapter 14
TABLE 14.1 Overview of designs and methods addressing the limitations of the ...
TABLE 14.2 Uses of qualitative research methods in outcome evaluation studies...
Chapter 15
TABLE 15.1 Participant characteristics associated with attrition.
Chapter 3
FIGURE 3.1 Representation of theory of insomnia.
Chapter 5
FIGURE 5.1 Logic model—diagram of relationships proposed by intervention the...
FIGURE 5.2 Logic model—flowchart.
SECOND EDITION
Souraya Sidani
Ryerson University
Toronto, ON, Canada
Carrie Jo Braden
University of Texas Health Science Center
San Antonio, TX, USA
This edition first published 2021
© 2021 John Wiley & Sons Ltd
Edition History
Wiley Blackwell (1e, 2011)
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by law. Advice on how to obtain permission to reuse material from this title is available at http://www.wiley.com/go/permissions.
The right of Souraya Sidani and Carrie Jo Braden to be identified as the authors of this work has been asserted in accordance with law.
Registered Offices
John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, USA
John Wiley & Sons Ltd, The Atrium, Southern Gate, Chichester, West Sussex, PO19 8SQ, UK
Editorial Office
9600 Garsington Road, Oxford, OX4 2DQ, UK
For details of our global editorial offices, customer services, and more information about Wiley products visit us at www.wiley.com.
Wiley also publishes its books in a variety of electronic formats and by print‐on‐demand. Some content that appears in standard print versions of this book may not be available in other formats.
Limit of Liability/Disclaimer of Warranty
The contents of this work are intended to further general scientific research, understanding, and discussion only and are not intended and should not be relied upon as recommending or promoting scientific method, diagnosis, or treatment by physicians for any particular patient. In view of ongoing research, equipment modifications, changes in governmental regulations, and the constant flow of information relating to the use of medicines, equipment, and devices, the reader is urged to review and evaluate the information provided in the package insert or instructions for each medicine, equipment, or device for, among other things, any changes in the instructions or indication of usage and for added warnings and precautions. While the publisher and authors have used their best efforts in preparing this work, they make no representations or warranties with respect to the accuracy or completeness of the contents of this work and specifically disclaim all warranties, including without limitation any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives, written sales materials or promotional statements for this work. The fact that an organization, website, or product is referred to in this work as a citation and/or potential source of further information does not mean that the publisher and authors endorse the information or services the organization, website, or product may provide or recommendations it may make. This work is sold with the understanding that the publisher is not engaged in rendering professional services. The advice and strategies contained herein may not be suitable for your situation. You should consult with a specialist where appropriate. Further, readers should be aware that websites listed in this work may have changed or disappeared between when this work was written and when it is read.
Neither the publisher nor authors shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.
Library of Congress Cataloging-in-Publication Data
Names: Sidani, Souraya, author. | Braden, Carrie Jo, 1944– author.
Title: Nursing and health interventions : design, evaluation and implementation / Souraya Sidani, Carrie Jo Braden.
Other titles: Design, evaluation, and translation of nursing interventions
Description: Second edition. | Hoboken, NJ : Wiley-Blackwell, 2021. | Preceded by Design, evaluation, and translation of nursing interventions / Souraya Sidani, Carrie Jo Braden. 2011. | Includes bibliographical references and index.
Identifiers: LCCN 2021013467 (print) | LCCN 2021013468 (ebook) | ISBN 9781119610120 (paperback) | ISBN 9781119610137 (adobe pdf) | ISBN 9781119610090 (epub)
Subjects: MESH: Nursing Care | Evaluation Studies as Topic | Nursing Research | Research Design | Translational Medical Research
Classification: LCC RT81.5 (print) | LCC RT81.5 (ebook) | NLM WY 100.1 | DDC 610.73072–dc23
LC record available at https://lccn.loc.gov/2021013467
LC ebook record available at https://lccn.loc.gov/2021013468
Cover Design: Wiley
Cover Image: © Alexkich/iStock/Getty Images
Interventions constitute the essence of nursing and health care. To be successful in promoting health and well‐being, health interventions have to be carefully designed, delivered, and evaluated, before they are implemented in practice.
Over the past decades, advances in intervention research have generated a range of designs, methods, and procedures for developing, providing, and determining the effectiveness of health interventions. The advances were motivated by accumulating evidence pointing to limitations in traditional approaches to intervention research, by the widening recognition of the value of client‐centered care, and by the increasing demand for interventions that are acceptable and adaptable to practice and for evidence that is relevant and meaningful in informing treatment decisions in practice. Practice‐relevant evidence indicates what clients, presenting with which personal and health characteristics, benefit, to what extent, from what health intervention, provided in what mode or format and what dose, in what context, as well as how health interventions work in producing the beneficial outcomes that are of importance to clients.
Advances in intervention research have been described in a multitude of sources spanning different disciplines and professions, and often using different terminology. This book is intended to serve as a helpful “one‐stop” resource for researchers and health professionals planning to engage in intervention research. The book is divided into five sections. The first section provides an overview of the conditions that instigated the advances, and of the systematic process for designing and evaluating health interventions. The second section presents approaches for developing new interventions that culminate in the generation of the intervention theory. The central role of the theory in guiding the planning and conduct of intervention delivery and evaluation is clarified. The third section details approaches and methods for delivering the intervention with fidelity and flexibility. The fourth section describes traditional and alternative research designs, methods, and procedures for evaluating the interventions’ acceptance, feasibility, process, and outcomes. The fifth section provides an overview of initiatives aimed at implementing evidence‐based interventions in practice.
The content of the book covers conceptual, empirical, and practical knowledge needed for the optimal design, delivery, evaluation, and implementation of health interventions. The conceptual knowledge clarifies the rationale or the “why” of the approaches and methods for designing and evaluating interventions; it explains the principles and logic underlying them and discusses their strengths and limitations. Empirical knowledge supports the utility of the approaches and methods by providing evidence on their strengths and limitations. The conceptual and empirical knowledge are combined to justify methodological decisions in intervention research. Practical knowledge describes the “what” and “how” of the approaches and methods; it provides guidance for applying them.
The goal is to support students, researchers, and health professionals in making appropriate decisions in:
Selecting the optimal approaches and methods for designing, delivering, evaluating, and implementing health interventions.
Generating evidence that informs treatment decisions in practice.
Promoting the adoption, adaptation, and implementation of interventions in practice, which ultimately leads to the provision of high‐quality, client‐centered healthcare.
We gratefully acknowledge the informal feedback from colleagues, the constant challenge of our students, and the instrumental support of staff at the Health Intervention Research Center, all of which contributed to the refinement of our thinking and the continuous expansion of our conceptual and methodological knowledge.
This work could not have been accomplished without the love, encouragement, and unlimited support of our families (in particular Cara Ager and Leila Sidani) and mentors.
This work was partially supported by the Canada Research Chairs Program.
Clients experience health problems and seek assistance from health professionals to address these problems. Health professionals are responsible for providing high‐quality healthcare that successfully manages the clients' problems and promotes their well‐being. Specifically, health professionals are expected to engage with clients in making treatment‐related decisions; provide the selected treatment or intervention; monitor clients' responses to treatment, that is, changes in the experience of the health problems; and adapt the intervention, as needed, to clients' responses. Thus, interventions constitute the central elements of healthcare, and their careful selection and appropriate delivery form the basis of high‐quality care. Sound decision‐making demands that health professionals: are aware of available interventions addressing the health problem with which clients present and of evidence regarding the benefits or effectiveness and the risks or discomfort associated with alternative interventions; inquire about clients' values and preferences; and collaboratively choose the intervention that is beneficial yet consistent with clients' values and preferences. Intervention research is concerned with generating evidence on the benefits and risks of health interventions, to inform treatment decision‐making in practice.
What constitutes empirical evidence that is useful in informing treatment decisions in practice is evolving. With the adoption of evidence‐based practice in the early 1990s, evidence from randomized controlled or clinical trials (RCTs), also known as experimental designs, was admitted as the most reliable or robust evidence of health interventions' benefits; the RCT is believed to have features that enhance the validity of inferences regarding the causal impact of interventions on outcomes (Holm et al., 2017). Consequently, evidence synthesized across RCTs was relied on to generate guidelines that inform decision‐making in practice.
In the past decades, experiences with evidence‐based practice revealed limitations of this approach to healthcare (Horwitz et al., 2017). These limitations, along with increasing societal value and demand for person‐centeredness, are contributing to shifts in perspectives on what constitutes high‐quality healthcare and what methods and strategies are appropriate for the design, delivery, and evaluation of interventions.
In this chapter, the treatment decision‐making steps and the information needed to guide decision‐making in practice are briefly reviewed. Limitations of evidence derived from RCTs in informing decision‐making in practice are highlighted and related to disregarding the principles of client‐centeredness and the complexity of the real world. Advances in research methodology that account for complexity are introduced. The overall process for designing, delivering, and evaluating interventions, and implementing them in practice, is briefly described.
Health professionals (i.e. practitioners, clinicians, therapists) include nurses, physicians, psychologists, dietitians, health educators, and allied health therapists such as respiratory, physical, occupational, and speech‐language therapists. They work independently and collaboratively to provide high‐quality healthcare to individuals, families, and communities (hereafter collectively referred to as clients) in a range of settings such as primary, home, acute, rehabilitation, and long‐term care.
Provision of high‐quality healthcare aims to: promote health; prevent and manage health problems; prevent complications; and maintain or improve well‐being. Currently, person‐, patient‐, or client‐centered care is viewed as the cornerstone of high‐quality care (Van Belle et al., 2019), and client participation in treatment decisions and in health management as the pillar of client‐centered care (Britten et al., 2017). Client participation is enacted in shared decision‐making. Shared decision‐making is an interactive process that involves collaboration between clients and health professionals, focused on making treatment decisions (Stacey & Légaré, 2015). The application of shared decision‐making in practice involves several steps (Coutu et al., 2015; Elwyn et al., 2014; Muscat et al., 2015; Shay & Lafata, 2014):
Health professionals conduct a comprehensive and thorough assessment of clients' condition. The assessment covers all domains of health, including biophysiological, physical, cognitive, emotional, behavioral, sociocultural, and spiritual domains, as well as the clients' personal account of the health problem and its impact on their life.
Health professionals critically analyze the findings of the assessment and, together with the clients, formulate the clients' health needs, that is, the potential (i.e. at‐risk) and actual health problems with which clients present and that require remediation; in addition, health professionals work with clients to prioritize these problems. An in‐depth and lucid understanding of the clients' health problems is necessary for selecting interventions to remediate them.
Health professionals appraise alternative interventions; discuss with the clients the benefits and risks of the interventions; elicit clients' preferences for interventions; and collaboratively select the most appropriate, effective, and safe ones.
Health professionals deliver the selected interventions for, on behalf, or with clients.
Health professionals and clients monitor clients' status on a regular basis to determine the extent to which the interventions were successful in addressing clients' health problems. If unsuccessful, health professionals and clients investigate factors that may influence the effectiveness of the interventions and should be accounted for in adapting the same interventions or providing alternative ones.
The application of decision‐making in practice requires a sound theoretical and empirical knowledge base of the health problem and the interventions. This implies that health professionals have access to:
A conceptualization of the health problem with which clients present. The conceptualization clarifies the nature of the problem; specifies its indicators; presents a comprehensive list of its determinants experienced in different domains of health and at different levels (e.g. interpersonal, environmental); and delineates the relationships between the health problem and other co‐occurring problems, which are likely to be observed with the increasing prevalence of multiple chronic conditions, particularly among older adults (Golfam et al., 2015). The conceptualization helps health professionals understand the clients' health problem and overall condition, which, in turn, guides the search for relevant interventions.
A conceptualization of alternative (if available) interventions addressing the same health problems. The conceptualization describes the components and activities comprising each intervention, and explains its mechanism of action (i.e. why and how the interventions work in addressing the health problem). This conceptualization is important for understanding the interventions and for adapting them to the clients' context or life circumstances (Levinton, 2017). Health professionals find it difficult to apply interventions they do not understand (Kazdin, 2007).
Empirical evidence that indicates the extent to which each alternative intervention is effective and safe in addressing the health problem, as well as its relative effectiveness (i.e. compared to other interventions).
Empirical evidence on the effectiveness of interventions has been regarded as a credible source to inform health professionals' practice and to guide decision‐making. It forms the foundation of evidence‐based practice.
Evidence‐informed or evidence‐based practice refers to “the conscientious, explicit, and judicious use of current, best evidence in making decisions about the care of individual patients” (Sackett et al., 1997, p. 2). Proponents of evidence‐based practice believe that interventions, evaluated in the context of research studies and found effective and safe, can be delivered in the same and consistent manner to produce the same effects in clients presenting with the same health problem, under the conditions of day‐to‐day practice. They advocate the development of guidelines to inform practice. Guidelines consist of systematically developed statements about recommendations for interventions that have demonstrated effectiveness and can be used to address a health problem, and procedures for monitoring the intervention's outcomes. The guidelines are disseminated to health professionals who are expected to implement the recommended interventions (Fernandez et al., 2015).
Proponents of evidence‐based practice developed a hierarchy of research designs that are most appropriate for generating evidence on the effectiveness of interventions. They place high value on evidence derived from primary or meta‐analytic studies that used the RCT design to investigate the effects of interventions. The RCT is deemed the most reliable, even the “gold standard” for intervention evaluation research because its features are believed to minimize potential biases. Controlling for biases is required for demonstrating the causal effects of the intervention on outcomes (Hansen & Tjørnhøj‐Thomsen, 2016; Holm et al., 2017).
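The logic behind this claim — that random assignment tends to balance known and unknown client characteristics across treatment arms — can be illustrated with a minimal simulation. The participant data, variable names, and the single baseline characteristic (age) below are invented for illustration only:

```python
import random
import statistics

def randomize(participants, seed=42):
    """Randomly assign each participant to the intervention or control arm."""
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    arms = {"intervention": [], "control": []}
    for i, person in enumerate(shuffled):
        arm = "intervention" if i % 2 == 0 else "control"
        arms[arm].append(person)
    return arms

# Hypothetical sample of 200 clients with one baseline characteristic (age).
participants = [{"id": i, "age": 40 + (i * 7) % 30} for i in range(200)]
arms = randomize(participants)

mean_age_int = statistics.mean(p["age"] for p in arms["intervention"])
mean_age_ctl = statistics.mean(p["age"] for p in arms["control"])

# With a reasonably large sample, chance alone keeps baseline characteristics
# similar across arms, so any post-treatment difference in outcomes is more
# plausibly attributed to the intervention rather than to group differences.
print(abs(mean_age_int - mean_age_ctl))
```

In an actual RCT, allocation would use concealed and often stratified or blocked randomization; the sketch only shows why comparability of arms at baseline underpins causal inference.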
To date, experiences with evidence‐based practice have been less than optimal. It is estimated that only up to 55% of clients receive interventions recommended in guidelines for acute, chronic, and preventive healthcare, and, when the interventions are provided, wide variations in their implementation have been observed (Greenhalgh et al., 2014; Harris et al., 2017). Several factors related to the characteristics of the healthcare system, organization, health professionals, clients, and the interventions affect the implementation of evidence‐based interventions and guidelines in daily practice (Lau et al., 2016). Evidence suggests that health professionals do not depend on research as a source of information to guide practice. Rather, they rely on other sources, primarily clinical knowledge either gained personally or shared by colleagues, as well as client experience (Spenceley et al., 2008).
Recently, concerns have been raised about the applicability of evidence, derived from primary and meta‐analytic studies using the RCT design, in informing practice (Ioannidis, 2016). Overall, the concerns stem from limitations of the RCT design in generating evidence that is relevant to the practice context (Braithwaite et al., 2018; Reeve et al., 2016). The limitations are related to the features of the RCT (i.e. careful selection of participants, random assignment, standardized delivery of treatment) that enable the focus on the direct causal effects of an intervention on outcomes and the control of potential sources of bias. As such, the RCT features ignore the complexity of the real world, the individuality of clients' experiences of the health problem and life circumstances as well as responses to treatment, and clients' participation in treatment decisions.
Careful selection of participants confines the RCT sample to a select subgroup of the target client population (e.g. clients with no comorbid conditions), which limits the applicability of the findings to other subgroups of clients seen in practice (Greenhalgh et al., 2014). Random assignment of participants to treatment groups does not reflect the treatment decision‐making process followed in practice. Therefore, random assignment is not well received by clients participating in the RCT (hereafter referred to as participants) and has been found to affect enrollment in the trial, attrition, and nonadherence to treatment, which weaken the validity of inferences regarding the effectiveness of an intervention (see Chapter 14). Standardized delivery of interventions is not responsive to clients' individual experiences, life circumstances, and preferences. Standardization also is difficult to transport into practice due to the complex and inter‐related influence of factors pertaining to clients, health professionals, and context (Chu & Leino, 2017; Leask et al., 2019). The focus on the average direct causal effects of the intervention ignores individual variability in clients' responses to treatment (i.e. the level of improvement in outcomes observed following treatment completion) and the mechanism through which the treatment produces its benefits; yet, health professionals need to understand which client subgroups respond favorably to the intervention and how the intervention produces its benefits in order to make appropriate treatment decisions (Horwitz et al., 2017; Lipsitz & Markowitz, 2013; Van Belle et al., 2016).
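The last point — that an average effect can describe neither of the subgroups it averages over — can be made concrete with a small hypothetical numerical example (the improvement scores are invented, not trial data):

```python
import statistics

# Hypothetical improvement scores (higher = better) for two client
# subgroups receiving the same intervention; values are illustrative.
responders = [8, 9, 10, 11, 12] * 10      # subgroup that benefits markedly
non_responders = [0, 1, -1, 0, 0] * 10    # subgroup with little change

average_effect = statistics.mean(responders + non_responders)

# The trial-wide average suggests a moderate benefit, yet it describes
# neither subgroup: responders improve by about 10 points, non-responders
# by about 0. Reporting only the average direct effect hides this
# clinically important variability in response to treatment.
print(average_effect, statistics.mean(responders), statistics.mean(non_responders))
```

This is why health professionals ask which subgroups respond favorably, not merely whether the average effect is positive.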
The limitations extend to meta‐analytic studies or systematic reviews of RCT findings, which form the basis for recommendations stated in guidelines. Attempts at synthesizing RCT‐derived evidence face challenges associated with limited replication (e.g. Pereira & Ioannidis, 2011). Limited replication is manifested in conflicting and, therefore, inconclusive evidence of the intervention's effectiveness (Hesselink et al., 2014). Accordingly, the guidelines' recommendations are usually stated in general terms that simply identify the interventions that can be used in addressing a health problem (Edwards et al., 2007). In addition, reports of primary and meta‐analytic studies, as well as guidelines, provide only brief descriptions of the interventions. Insufficient description of the interventions constrains their replication and proper implementation in research and practice (Bach‐Mortensen et al., 2018; Levinton, 2017). For instance, Glasziou et al. (2010) found that health professionals were able to replicate the interventions evaluated in only half of 80 studies published in the journal Evidence‐Based Medicine. Furthermore, the guidelines do not offer instructions on how to adapt the design and delivery of interventions in a way that preserves their active ingredients yet is responsive to the characteristics, preferences, and life circumstances of clients and to the resources available in local practice contexts (Bach‐Mortensen et al., 2018; Westfall et al., 2009).
Accordingly, the evidence generated in primary and meta‐analytic studies using the RCT design is of limited utility in informing practice. It does not address the questions that health professionals ask when making treatment decisions (Bonell et al., 2018; Levinton, 2017). The questions include:
Who (i.e. clients with what sociodemographic and health or clinical profiles) benefits most (i.e. demonstrates improvement in outcomes) from an intervention, delivered in what mode and at what dose?
What are the intervention's active ingredients (operationalized in what specific components) responsible for its benefits?
What risks or discomforts are associated with the intervention?
How and why does the intervention work to produce its benefits? Or, what is the mechanism of action responsible for the intervention's effectiveness in addressing the health problem?
What resources are needed to deliver the intervention?
What contextual factors influence the delivery of the intervention by health professionals, its uptake and enactment by clients, and its effectiveness?
To what extent and how can the intervention be tailored to the individual clients' characteristics or preferences, and/or adapted to the local practice context?
What alternative interventions are available to address the health problem, and what are their relative benefits (effectiveness) and risks (safety)?
Intervention research needs to be reoriented toward developing well‐conceptualized yet practice‐relevant interventions, and generating the evidence that addresses these questions. The goal is to consolidate the theoretical and empirical knowledge that informs practice, and ultimately improves the quality of healthcare and the health of clients. To be useful in informing practice, intervention research should embrace a realist, pragmatic perspective in reflecting the characteristics of practice: client‐centeredness and complexity. This can be achieved through client engagement and use of a range of relevant research designs and methods.
The less-than-optimal experiences with evidence-based practice and the limited applicability of RCT-derived evidence to practice, in combination with clients' demand for an approach to healthcare that reflects their individuality, values, and preferences, have led to the resurgence of client-centered care as the “core” of high-quality healthcare (Beck et al., 2010; de Boer et al., 2013; Sidani & Fox, 2014; Van Belle et al., 2019; Vijn et al., 2018).
Client‐centeredness is an approach to healthcare familiar to health professionals. Professionals are instructed, socialized, and expected to deliver client‐centered care. Client‐centered care is applied at different levels. At the individual level, it involves the application of tailored and adaptive interventions addressing the presenting health problem or aiming to change health behaviors (Hekler et al., 2018) and personalized or precision medicine (Bothwell et al., 2016). At the group level, client‐centeredness is illustrated by family‐centered care or the provision of health interventions that are adapted to the demands and preferences of particular communities such as ethno‐cultural communities (Barrera et al., 2013; Netto et al., 2010). At the healthcare organization level, client‐centeredness involves the adaptation of evidence‐based interventions and practice guidelines to the local context (Harrison et al., 2010; Powell et al., 2017) and at the system level, it is reflected in patient engagement (McNeil et al., 2016).
In general, the application of client‐centered care involves: (1) a comprehensive and thorough assessment of the clients' condition to identify their health problems, beliefs, values and preferences; (2) collaboration and active participation of clients in prioritizing their problems, designing new or selecting available, evidence‐based interventions, and implementing the selected interventions (as is done in shared decision‐making); and (3) adaptation or tailoring of the intervention for consistency with clients' problems, beliefs, values, and preferences, as well as with their changing experiences of the health problem, and life circumstances, over time.
Accumulating evidence supports the benefits of client-centered care. At the individual level, client-centered care was found to improve clients' knowledge of their condition and treatment, experiences with healthcare, and general health and well-being. It also enhanced adherence to treatment and self-efficacy in managing the health problem, and reduced health services use and cost (Barello et al., 2012; Fors et al., 2018; Hibbard & Greene, 2013; Ren et al., 2019; Vijn et al., 2018). Similarly, tailored interventions were reported to be more effective than non-tailored ones (Hawkins et al., 2008; Richards et al., 2007). At the community level, providing culturally tailored interventions was associated with increased client satisfaction but not with improvement in health outcomes (Renzoto et al., 2013). At the healthcare organization and system levels, client-centered care contributed to the development of new and improved services (Mockford et al., 2012).
The provision of client-centered care, the cornerstone of high-quality healthcare (Van Belle et al., 2019), requires the availability of interventions with demonstrated appropriateness, acceptability, effectiveness, safety, and efficiency. Appropriate interventions are logical, reasonable, and sound treatments that address a specific health problem. This implies that the nature of an intervention, reflected in its active ingredients, is consistent with the nature of the health problem.
Acceptable interventions are perceived as desirable by the clients expected to receive them. Desirable interventions are perceived as consistent with the clients' beliefs about the health problem and its treatment, suitable to their lifestyle, and safe and convenient to apply in daily life (Sidani et al., 2018). Related to acceptability is the notion of the cultural relevance of interventions, which refers to the congruence of the interventions' components and mode and dose of delivery with the beliefs, values, and norms held by particular groups or ethno-cultural communities (Barrera et al., 2013). Effective interventions produce the best health outcomes by activating the anticipated mechanism of action (Dalkin et al., 2015); that is, they induce changes in clients' cognition, skills, or behaviors that mediate improvements in the experience of the health problem, health, and well-being. Safe interventions are associated with no or minimal risks or discomfort (Bonell et al., 2015). Efficient interventions are optimized in terms of content, delivery, and the resources required for their implementation so as to maximize health outcomes; that is, they yield the highest impact (i.e. large improvement in outcomes in a large proportion of the population) within a reasonably short time period (i.e. speed of recovery) (Benedikt et al., 2016; Morin et al., 2014).
New approaches are needed to design and evaluate health interventions in ways that inform the application of client-centeredness in practice. Approaches for designing appropriate health interventions rely on generating a comprehensive understanding of the health problem (see Chapter 3) and identifying the intervention's active ingredients (see Chapter 4), which are integrated into the intervention theory (see Chapter 5). Approaches for designing acceptable interventions involve engaging clients in the design of interventions, developing tailored or adaptive interventions (see Chapter 4), and assessing clients' perceived acceptability of interventions (see Chapter 11). Approaches for evaluating the effectiveness, safety, and efficiency of health interventions entail recognizing the complexity of the real world (see Section 1.4) and using a range of research designs and methods to answer the practice-related questions listed in Section 1.2. The goal of intervention research is to generate evidence that is grounded in and useful to practice (Westfall et al., 2009), which is characterized as client-centered.
The complexity of the real world is a fact. Clients live in a complex environment in which multiple factors contribute to their health and their capacity for healthy living. They may experience one or more health problems associated with a range of determinants. These complex health problems require complex interventions for successful remediation. Several health professionals are involved in delivering complex interventions, in a context characterized by factors operating at different levels that contribute to the success (or failure) of the intervention's implementation and effectiveness. The complexity of the real world should be accounted for, not ignored as it is in the RCT design, when developing and evaluating health interventions, in order to generate evidence of relevance to practice.
Accounting for complexity demands acknowledging multi-causality, as well as the individuality of clients, in the design and evaluation of health interventions. This can be achieved by developing the theory of the health problem (see Chapter 3), multicomponent interventions (see Chapter 4), and the theory of change (see Chapters 4 and 5), as well as by examining the influence of contextual factors on the implementation of the intervention (see Chapter 13) and individual variability in clients' responses to interventions.
In practice, many clients present with one or more health problems and a range of life circumstances (or context). The experience of each problem may be associated with multiple determinants or causes, occurring in different domains of health (e.g. physical, psychological) and at different levels (e.g. intrapersonal, interpersonal, environmental) (Diez‐Roux, 2011). The problems and their determinants are often inter‐related, forming a “web of causation” (Golfam et al., 2015), also called multi‐causality. Understanding these inter‐relations is essential for designing and evaluating health interventions; this can be achieved with the development of the theory of the health problem to be targeted by an intervention. The theory of the health problem is a means for integrating the determinants of the health problem and delineating the complex inter‐relationships among them (Fleury & Sidani, 2018). The theory of the health problem points to aspects of the problem amenable to change, which informs the design of interventions. Interventions based on a clear understanding of the health problem were found to be most successful (e.g. Glanz & Bishop, 2010; Prestwich et al., 2014). The theory is also useful in guiding practice; it delineates aspects of the health problem that should be assessed, thereby ensuring a comprehensive and thorough assessment and understanding of the clients' condition, as advocated in client‐centered care.
Complex health problems require complex solutions. Complex interventions consist of multiple components (Medical Research Council, 2019). Each component involves a set of inter-related activities, performed by clients and health professionals, that share the common goal of managing a particular aspect (e.g. one determinant) of the health problem (Greenwood-Lee et al., 2016). Complex interventions can be delivered in a standardized way, whereby all clients are given all components, or tailored to the individual client's experience of the health problem. In the latter case, clients are provided the component, or subset of components, most appropriate for addressing the most salient aspect of the health problem as they experience it.
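The tailoring logic described above can be sketched in a few lines of code. This is a hypothetical illustration only: the component names and the mapping from assessed determinants to components are assumptions for the sake of the example, not content from the text.

```python
# Hypothetical sketch: tailoring a multicomponent intervention.
# The determinants, component names, and mapping below are illustrative
# assumptions; in practice they would come from the theory of the problem.
COMPONENTS = {
    "worry_at_bedtime": "cognitive_restructuring",
    "irregular_sleep_schedule": "sleep_restriction",
    "poor_sleep_habits": "sleep_hygiene_education",
}

def tailor(assessment: dict) -> list:
    """Select the component(s) matching the client's salient determinants.

    Standardized delivery would instead give every client all components;
    tailored delivery gives only the subset matching the assessment.
    """
    selected = [comp for det, comp in COMPONENTS.items() if assessment.get(det)]
    # Fall back to the full package when no single determinant stands out.
    return selected or list(COMPONENTS.values())

# A client whose salient determinant is bedtime worry receives only the
# matching component rather than the full standardized package.
plan = tailor({"worry_at_bedtime": True})
```

The design choice to fall back to the full package mirrors the distinction drawn in the text between standardized delivery (all components to all clients) and tailored delivery (a subset matched to the client's experience of the problem).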
Each component of a complex health intervention targets a particular aspect of the health problem, and, therefore, activates a unique mechanism of action. When a combination of components is delivered, the components may act interdependently in producing complex, multiple causal pathways that represent the mechanism of action responsible for the intervention's effects on the outcomes. The theory of change integrates these pathways (Mayne, 2015; Montague, 2019; Powell, 2019) to explain how the complex intervention, in its totality, works to bring about beneficial changes in the health problem and other outcomes (Bleijenberg et al., 2018).
The theory of change guides the plan and conduct of intervention evaluation studies. It identifies the interventions' processes, mediators, and outcomes to be investigated; gives direction for their measurement as well as the timing of their assessment; and assists in interpreting the findings. In practice, health professionals' awareness of the theory empowers engagement in an enlightened and judicious decision‐making process.
Interventions, complex or not, are delivered in real-world contexts. The context includes the practice setting in which health professionals provide the interventions and the environment in which clients apply the treatment recommendations. In either case, context is complex, characterized by variability in a multitude of factors occurring at different levels (Chandler et al., 2016; Masterson-Algar et al., 2018) and at different points in time (Cambon et al., 2019). The factors are embedded within the physical, psycho-socio-cultural, economic, and political setting or environment, and encountered at the micro (e.g. individual, home), meso (e.g. workplace, organization), or macro (e.g. healthcare system) level. Contextual factors influence the implementation and the effectiveness of interventions (Craig et al., 2018). It is, therefore, essential to account for context in the design and evaluation of interventions. This can be achieved by: (1) engaging various stakeholder groups (e.g. clients, health professionals, decision-makers) in the design and adaptation of interventions (Braithwaite et al., 2018; Greenwood-Lee et al., 2016); these groups are knowledgeable about the local context and provide valuable feedback on which factors operate in that context, how to address them, and how to modify the intervention to enhance its fit with the features of, and resources available in, the local context; and (2) incorporating within an intervention evaluation study a process evaluation (see Chapter 13) aimed to monitor the implementation of the intervention, examine its mechanism of action, and explore if and how contextual factors affect the intervention's implementation and effectiveness (Moore et al., 2019). The findings of a process evaluation are critical for the validity of a study's conclusions; specifically, they point to factors that contributed, positively or negatively, to the intervention's effects.
Individuality of clients adds to the complexity of real-world practice. In addition to experiencing co-occurring health problems, clients vary in their sociodemographic and health profiles and, most importantly, differ in their beliefs about health in general and about the presenting health problem, such as its possible causes. These beliefs influence clients' health behaviors and shape their preferences for treatment (De las Cuevas et al., 2018). Respecting clients' beliefs and accounting for their preferences are principles of client-centered care that are gaining wide interest in intervention research. This is evident in: (1) calls to determine the social acceptability, in addition to the clinical effectiveness and economic efficiency, of interventions (Staniszewska et al., 2010), and to design tailored interventions that customize treatment to individual clients' characteristics and preferences (Radhakrishnan, 2012); and (2) widening recognition of the utility of pragmatic and preference trials for evaluating interventions. Clients also differ in their responses to interventions: some experience improvement in the health problem, whereas others show no change or even deterioration. The latter subgroups of clients may require modification of their treatment, an approach referred to as adaptive interventions. The modification or adaptation may take a range of forms, such as intensifying the intervention (e.g. increasing its dose) or providing a different one (e.g. stepped-up care), based on clients' responses (Hekler et al., 2018). Advances in health technology are facilitating the design and delivery of adaptive interventions, and innovative research designs have been proposed to evaluate them. Planned subgroup analysis can be applied to determine the profiles of clients who most benefit from an intervention.
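The idea behind a planned subgroup analysis can be illustrated with a minimal sketch. The data below are synthetic and the variable names (comorbidity status, improvement scores) are assumptions for illustration; a real analysis would prespecify the subgroups and test the contrast formally (e.g. via a treatment-by-subgroup interaction term).

```python
# Illustrative sketch of a planned subgroup analysis on synthetic data.
from statistics import mean

# Each record: (subgroup, study arm, outcome improvement score) -- invented values.
records = [
    ("no_comorbidity", "intervention", 8), ("no_comorbidity", "control", 3),
    ("no_comorbidity", "intervention", 9), ("no_comorbidity", "control", 4),
    ("comorbidity", "intervention", 4), ("comorbidity", "control", 3),
    ("comorbidity", "intervention", 5), ("comorbidity", "control", 4),
]

def subgroup_effects(data):
    """Mean intervention-minus-control difference within each planned subgroup."""
    effects = {}
    for sub in {r[0] for r in data}:
        tx = [y for s, arm, y in data if s == sub and arm == "intervention"]
        ctl = [y for s, arm, y in data if s == sub and arm == "control"]
        effects[sub] = mean(tx) - mean(ctl)
    return effects

effects = subgroup_effects(records)
# A larger within-subgroup difference suggests which client profiles
# benefit most from the intervention.
```

In this toy dataset the effect is larger among clients without comorbid conditions, which is exactly the kind of profile information the text argues health professionals need for treatment decisions.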
The high value placed on client‐centered care and the less‐than‐optimal implementation of evidence‐based interventions by health professionals, and uptake and enactment by clients, served as the impetus for engaging clients in intervention research. Client engagement takes place in different stages and steps of research:
Identifying research priorities for funding agencies (e.g. Patient-Centered Outcomes Research Institute in the US), most pressing health needs of the general public, or services requiring improvement in a healthcare organization or system: The James Lind Alliance has developed a systematic process for engaging clients (e.g. persons experiencing a health problem, health professionals) in identifying research priorities (Cowan, 2010; Manafò et al., 2018).
Setting the research questions to be addressed in a study: Clients join the research team as collaborators. They actively participate in stating the study aims, and may assist in preparing or reviewing the grant proposal prior to submission.
Designing new interventions, co-creating or co-producing the intervention protocol and materials (Hawkins et al., 2017; Kildea et al., 2019), selecting and adapting evidence-based interventions (Aarons et al., 2012; Sidani et al., 2017): This involves a systematic process in which clients serve as consultants or as participants in a research study aimed to adapt or co-create interventions.
Delineating the study protocol: As collaborators, clients have experiential knowledge that is useful in determining: the target populations' acceptance of randomization and of methods for data collection; effective sources and strategies for recruitment; convenient locations for delivering the interventions; and suitability (comprehension, readability, response burden) of measures to a range of participants.
Recruiting participants, facilitating data collection, and assisting in the interpretation and dissemination of findings: Clients serving as collaborators or participants in a study can assume these responsibilities.
Client engagement in intervention research may reduce research waste. Client involvement is expected to: (1) identify research questions relevant to research or evidence users including clients or the general public, health professionals, and decision‐makers (Ioannidis, 2016; McLeod et al., 2014); (2) yield interventions that are optimally designed (Bleijenberg et al., 2018) and acceptable to users, which is likely to improve their uptake in practice; and (3) enhance participants' enrollment and retention, thereby reducing the resources, cost, and time needed to complete the study.
The increasing demand for addressing questions of relevance to practice (Chavez-MacGregor & Giordano, 2016; Concato et al., 2010) and mounting evidence dispelling misconceptions about the strengths of the RCT and the weaknesses of non-RCT or observational designs (Frieden, 2017) have brought to the forefront the importance of the research questions or aims in informing the selection of research designs and methods in intervention research (Skivington et al., 2018). Accordingly, researchers have a more inclusive range of research designs and methods to choose from. Designs considered appropriate for evaluating health interventions are presented in several publications (e.g. Medical Research Council guidance, 2019; Shadish et al., 2002; Sidani, 2015). The main categories of designs and methods are described in Chapters 14 and 15, respectively. The overall trend is toward embracing a pragmatic, realist approach to intervention evaluation research that is conducted within the context of practice and reflects the complexity of inter-relations among client, health professional, and contextual factors, intervention implementation, and outcomes. Practical trials, preference trials, adaptive designs, and multiple or mixed-methods designs are relevant methodological innovations. The selection of a research design should be informed by the research questions, taking into consideration feasibility, ethical, and safety issues (Lobo et al., 2017).
The process for designing, evaluating, and implementing interventions is systematic and rigorous, yet flexible and iterative (Czajkowski et al., 2015; Medical Research Council, 2019). It involves phases that are logically sequenced. Although some may be conducted simultaneously, the results of each phase drive the work forward toward the next phase or backward toward earlier phases. For instance, feasibility and acceptability can be examined simultaneously rather than sequentially. Newly developed interventions found acceptable to the target population are moved to the next phase for evaluating their effectiveness, whereas interventions deemed unacceptable should be reconceptualized to optimize their design (i.e. moved back to the drawing board!). Each phase is carried out using research designs and methods that are most pertinent to address the respective research questions or achieve the stated aims, and to maintain the validity of findings. The phases are briefly mentioned in Table 1.1, with an emphasis on what they aim to achieve. The book is organized into sections that detail the research methods that can be used in designing, evaluating, and implementing interventions. Different methods are discussed, consistent with the recommendation for selecting those that are most appropriate to address the research questions and are feasible within the practice context.
TABLE 1.1 Phases of the process for designing, evaluating, and implementing interventions.
Process: Designing interventions
  Phase: Generating an understanding of the health problem
    Aims: Clarify the health problem requiring remediation (conceptual and operational definition; determinants; consequences); develop a theory of the health problem; identify aspects of the health problem that are amenable to change or remediation.
  Phase: Developing the intervention
    Aims: Conceptualize the intervention's active ingredients; operationalize the active ingredients in specific and nonspecific components; operationalize the components (goals and activities; mode of delivery; dose); delineate the intervention's mechanism of action; develop the theory of change.
  Phase: Developing the intervention theory
    Aims: Integrate the theory of the problem and the theory of change; identify contextual factors affecting the intervention's delivery, mechanism of action, and effectiveness; operationalize the intervention theory.
Process: Delivering interventions
  Phase: Developing the intervention protocol
    Aims: Describe in detail the content to be covered and the activities to be performed by health professionals (or interventionists) and clients during intervention delivery.
  Phase: Training health professionals or interventionists
    Aims: Select interventionists based on well-defined professional qualifications and personal characteristics; provide training and support.
  Phase: Monitoring fidelity
    Aims: Assess theoretical fidelity; develop or select measures for assessing operational fidelity; investigate operational fidelity throughout the intervention delivery period.
Process: Evaluating interventions
  Phase: Examining perceptions of the intervention
    Aims: Assess clients' perceptions of the intervention before, during, and following delivery.
  Phase: Examining feasibility of the intervention and research methods
    Aims: Assess interventionists' and clients' perceptions of the intervention's feasibility; determine the acceptability and feasibility of the research methods for intervention evaluation; revise the design of the intervention and the research methods as necessary.
  Phase: Evaluating process
    Aims: Monitor the delivery of the intervention; examine the intervention's mechanism of action; explore contextual factors affecting the intervention's delivery, mechanism of action, and effectiveness; revise the conceptualization, operationalization, and delivery of the intervention based on process evaluation results.
  Phase: Evaluating outcomes
    Aims: Determine the effectiveness of the intervention in producing the intended beneficial health outcomes; explore safety (risks or discomforts associated with the intervention) and unintended outcomes.
Process: Implementing interventions
  Phase: Adapting evidence-based interventions
    Aims: Examine the acceptability and feasibility of the intervention in the local context; explore modifications required to enhance the fit of the intervention to the local context; assess barriers and facilitators of implementation in the local context; select implementation techniques.
  Phase: Implementing evidence-based interventions
    Aims: Support the implementation initiative.
Aarons, G.A., Green, A.E., Palinkas, L.A., et al. (2012) Dynamic adaptation process to implement an evidence‐based child maltreatment intervention.
Implementation Science
, 7, 32–40.
Bach‐Mortensen, A.M, Lange, B.C.L., & Montgomery, P. (2018) Barriers and facilitators to implementing evidence‐based interventions among third sector organization: A systematic review.
Implementation Science
, 13, 103–121.
Barello S, Graffigna G, & Vegni E. (2012) Patient engagement as an emerging challenge for healthcare services: Mapping the literature.
Nursing Research and Practice
, 2012, 905–934.
Barrera, M., Castro, F.G., Strycker, L.A., et al. (2013) Cultural adaptations of behavioral health interventions: A Progress report.
Journal of Consulting and Clinical Psychology
, 81(2), 196–205.
Beck, C., McSweeney, J.C., Richards, K.C., et al. (2010) Challenges in tailored intervention research.
Nursing Outlook
, 58(2), 104–110.
Benedikt, C., Kelly, S.L., Wilson, D., & Wilson, D.P., on behalf of the Optima Consortium (2016) Allocative and implementation efficiency in HIV prevention and treatment for people who inject drugs.
International Journal of Drug Policy
, 38, 73–80.
Bleijenberg, N., de Man‐van Ginkel, J.M., Trappenburg, J.C.A., et al. (2018) Increasing value and reducing waste by optimizing the development of complex interventions: Enriching the development phase of the Medical Research Council (MRC) framework.
International Journal of Nursing Studies
, 79, 86–93.
de Boer, D., Delnoij, D., & Rademakers, J. (2013) The importance of patient‐centered care for various groups.
Patient Education and Counseling
, 90, 405–410.
Bonell, C., Jamal, F., Melendez‐Torres, G.J., & Cummins, S. (2015) ‘Dark logic’: Theorizing the harmful consequences of public health interventions.
Journal of Epidemiology & Community Health
, 69, 95–98.
Bonell, C., Moore, G., Warren, E., & Moore, L. (2018) Are randomised controlled trials positivist? Reviewing the social science and philosophy literature to assess positivist tendencies of trials of social interventions in public health and health services.
Trials
, 19(1), 238–249.
Bothwell, L.E., Greene, J.A., Podolsky, S.H., et al. (2016) Assessing the gold standard—Lessons from the history of RCTs.
New England Journal of Medicine
, 374, 2175–2181.
Braithwaite, J, Churruca, K, Long, J.C., et al. (2018) When complexity science meets implementation science: A theoretical and empirical analysis of system change.
BMC Medicine
, 16, 63–76.
Britten N, Moore L, Lydahl D, et al. (2017) Elaboration of the Gothenburg model of person‐centred care.
Health Expectations
, 20, 407–418.
Cambon, L., Terral, P., & Alla, F. (2019) From intervention to interventional system: towards greater theorization in population health intervention research.
BMC Public Health
, 19, 389–345.
Chandler, J., Rycroft‐Malone, J., Hawkes, C., & Noyes, J. (2016) Application of simplified complexity theory concepts for healthcare social systems to explain the implementation of evidence into practice.
Journal of Advanced Nursing
, 72(2), 461–480.
Chavez‐MacGregor, M. & Giordano, S.H. (2016) Randomized clinical trials and observational studies: Is there a battle?
Journal of Clinical Oncology
, 34, 772–773.
Chu, J. & Leino, A. (2017) Advancement in the maturing science of cultural adaptations of evidence‐based interventions.
Journal of Consulting and Clinical Psychology
, 85(1), 45–57.
Concato, J., Peduzzi, P., Huang, G.D., et al. (2010) Comparative effectiveness research: What kind of studies do we need?
Journal of Investigative Medicine
, 58, 764–769.
Coutu, M.‐F., Légaré, F., Stacey, D., et al. (2015) Occupational therapists' shared decision‐making behaviors with patients having persistent pain in a work rehabilitation context: A cross‐sectional study.
Patient Education & Counseling
,
98, 864–970.
Cowan, K. (2010) The James Lind Alliance: Tackling treatment uncertainties together.
The Journal of Ambulatory Care Management
, 33(3), 241–248.
Craig, P., Di Ruggiero, E., Frohlich, K.L., Mykhalovskiy, E., & White, M., on behalf of the Canadian Institutes of Health Research (CIHR)—National Institute for Health Research (NIHR). (2018).
Taking Account of Context in Population Health Intervention Research: Guidance for Producers, Users and Funders of Research
. NIHR Evaluation, Trials and Studies Coordinating Centre, Southampton.
Czajkowski, S.M., Powell, L.H., Adler N., et al. (2015) From ideas to efficacy: The ORBIT model for developing behavioral treatments for chronic diseases.
Health Psychology
, 34(10), 971–982.
Dalkin, S.M., Greenhalgh, J., Jones, D., et al. (2015) What's in a mechanism? Development of a key concept in realist evaluation.
Implementation Science
, 10, 49–55.
De las Cuevas, C., Motuca, M., Baptista, T., & de Leon, J. (2018) Skepticism and phamacophobia toward medication may negatively impact adherence to psychiatric medications: A comparison among outpatient samples recruited in Spain, Argentina, and Venezuela.
Patient Preference and Adherence
, 12, 301–310.
Diez‐Roux, A.V. (2011). Complex system thinking and current impasses in health disparities research.
American Journal of Public Health
, 101(9), 1627–1634.
Edwards, N., Davies, B., Ploeg, J., Virani, T. & Skelly, J. (2007) Implementing nursing best practice guidelines: Impact on patient referrals.
BMC Nursing
, 6, 4–12.
Elwyn, G., Frosch, D., Thomson, R., et al. (2014) Shared decision making: A model for clinical practice.
Journal of General Internal Medicine
,
27(10), 1361–1367.
Fernandez, A., Sturmberg, J., Lukersmith, S., et al. (2015) Evidence‐based medicine: Is it a bridge too far?
Health Research Policy and Systems
, 13, 66–74.
Fleury, J. & Sidani, S. (2018). Using theory to guide intervention studies. In: B.M. Melnyk & D. Morrison‐Beedy (eds)
Intervention Research and Evidence‐Based Quality Improvement Second Edition: Designing, Conducting, Analyzing, and Funding
. Springer, New York, NY
Fors, A., Blanck, E., Ali, L., et al. (2018) Effects of a person‐centred telephone‐support in patients with chronic obstructive pulmonary disease and/or chronic heart failure—A randomized controlled trial.
PLoS One
, 13(8), e0203031.
Frieden, T.R. (2017) Evidence for health decision making—Beyond randomized, controlled trials.
New England Journal of Medicine
, 377, 465–475.
Glanz, K. & Bishop, D.B. (2010) The role of behavioral science theory in development and implementation of public health interventions.
Annual Review of Public Health
, 31, 399–418.
Glasziou, P., Chalmers, I., Altman, D.G., et al. (2010) Taking healthcare interventions from trial to practice.
BMJ
, 341, c3852.
Golfam, M., Beall, R., Brehaut J., et al. (2015) Comparing alternative design options for chronic disease prevention interventions.
European Journal of Clinical Investigation
, 45, 87–99.
Greenhalgh, T., Howick, J., & MasKreg, N, for the Evidence Based Medicine Renaissance Group (2014) Evidence based medicine: A movement in crisis?
BMJ
, 348, g3725.
Greenwood‐Lee, J., Hawe, P., Nettel‐Aguirre, A., et al. (2016) Complex intervention modelling should capture the dynamics of adaptation.
BMC Medical Research Methodology
, 16, 51–57.
Hansen, H.P. & Tjørnhøj‐Thomsen, T. (2016) Meeting the challenges of intervention research in health science: An argument for a multimethod research approach. Patient, 9, 193–200.
Harris, M., Lawn, S.J., Morello, A., et al. (2017) Practice change in chronic conditions care: An appraisal of theories. BMC Health Services Research, 17, 170–179.
Harrison, M., Legare, F., Graham, I. & Fervers, B. (2010) Adapting clinical practice guidelines to local context and assessing barriers to their use. CMAJ, 182(2), 78–84.
Hawkins, R.P., Kreuter, M., Resnicow, K., et al. (2008) Understanding tailoring in communicating about health. Health Education Research, 23(3), 454–466.
Hawkins, J., Madden, K., Fletcher, A., et al. (2017) Development of a framework for the co‐production and prototyping of public health interventions. BMC Public Health, 17, 689–699.
Hekler, E.B., Rivera, D.E., Martin, C.A., et al. (2018) Optimizing adaptive interventions: Tutorial on when and how to use control systems engineering to optimize adaptive mHealth interventions. Journal of Medical Internet Research, 20(6), e214.
Hesselink, G., Zegers, M., Vernooij‐Dassen, M., et al. (2014) Improving patient discharge and reducing hospital readmissions by using intervention mapping. BMC Health Services Research, 14, 389–399.
Hibbard, J.H. & Greene, J. (2013) What the evidence shows about patient activation: Better health outcomes and care experiences. Health Affairs, 32(2), 207–214.
Holm, M., Alvariza, A., Fürst, C.‐J., et al. (2017) Recruiting participants in a randomized controlled trial testing an intervention in palliative cancer care—The perspectives of health care professionals. European Journal of Oncology Nursing, 31, 6–11.
Horwitz, R.I., Hayes‐Conroy, A., Coricchio, R. & Singer, B.H. (2017) From evidence based medicine to medicine based evidence. The American Journal of Medicine, 130, 1246–1250.
Ioannidis, J.P.A. (2016) Why most clinical research is not useful. PLoS Medicine, 13(6), e1002049.
Kazdin, A.E. (2007) Mediators and mechanisms of change in psychotherapy research. Annual Review of Clinical Psychology, 3, 1–27.
Kildea, J., Battista, J., Cabral, B., et al. (2019) Design and development of a person‐centered patient portal using participatory stakeholder co‐design. Journal of Medical Internet Research, 21(2), e11371.
Lau, R., Stevenson, F., Ong, B.N., et al. (2016) Achieving change in primary care—Causes of the evidence to practice gap: Systematic reviews of reviews. Implementation Science, 11, 40–50.
Leask, C.F., Sandlund, M., Skelton, D.A., et al. (2019) Framework, principles and recommendations for utilising participatory methodologies in the co‐creation and evaluation of public health interventions. Research Involvement and Engagement, 5, 2–17.
Leviton, L.C. (2017) Generalizing about public health interventions: A mixed‐methods approach to external validity. Annual Review of Public Health, 38, 371–391.
