Authors

  1. Korda, Holly PhD, MA

Abstract

Fidelity of implementation, that is, replicating evidence-based health promotion interventions as they were designed and proven through research, is crucial if evidence-based community and population health interventions are to achieve promised results, yet it can be difficult to attain in practice. This article highlights major challenges, considerations, and strategies associated with fidelity as public health care practitioners implement evidence-based interventions and bring them to scale in the field. There is a need to share information about "what works" in implementing these interventions with fidelity, as well as information about implementation challenges and improvements to intervention design. Fidelity is important if practitioners are to obtain the results and outcomes planned by intervention developers and is an essential value proposition for evidence-based public health practice and population health improvement.

 

Article Content

Current public health interest in community and population health promotion emphasizes broad dissemination of evidence-based interventions designed to address many varied health and wellness concerns, from the Peers Reaching Out and Modeling Intervention Strategies1 (PROMISE), which helps participants move toward safer sex behaviors, to the Stanford Chronic Disease Self-Management Program2, which helps individuals manage their chronic health conditions. Evidence-based interventions are composed of carefully structured components and processes, for example, defined materials, curricula, and content, that have been developed and tested on the basis of behavior change theory, are replicable, and have demonstrated efficacy and effectiveness through rigorous research. Dissemination of these interventions is part of wide-ranging multidisciplinary efforts to promote evidence-based strategies in public health. The Community Preventive Services Task Force3 and other leaders in public health and prevention recommend that public health departments and public health care practitioners incorporate evidence-based approaches and knowledge about "what works" as part of comprehensive community and population health strategies.4-6

 

What Is Fidelity?

Fidelity in implementing evidence-based interventions refers to the degree to which an intervention has been implemented and delivered as designed; it is essential if evidence-based interventions are to provide the same results as those shown in studies. Meeting fidelity requirements means including the same components as designed and proven in the intervention, using standard materials and approaches to train the staff who deliver it, and providing the associated curriculum, delivery content, and designated number of sessions. While differences in terminology exist across public health disciplines,7 fidelity typically refers to 5 domains: adherence (whether the intervention has been implemented as it was designed), exposure or dosage (the amount of the intervention participants receive compared with that described by the designers), quality of delivery (the manner in which an intervention is delivered by the trainer or facilitator), participant responsiveness (the extent to which participants are engaged by the intervention), and program differentiation (the unique features of the intervention's components that are essential to its success).8 A sixth domain, structure/process, or the service delivery framework and the roles and behaviors of participants, has also been identified as a characteristic of fidelity.9
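
To illustrate how these domains can be operationalized, the sketch below shows one way a delivery site might record ratings against each domain for a single session. It is a minimal, hypothetical example written in Python; the field names, 0-to-5 rating scale, and summary score are assumptions made for illustration and are not drawn from the frameworks cited above.

from dataclasses import dataclass, fields

@dataclass
class FidelityRecord:
    """Illustrative per-session fidelity ratings (hypothetical 0-5 scale)."""
    adherence: int                   # implemented as designed
    dosage: int                      # amount received vs amount specified by designers
    quality_of_delivery: int         # how the trainer/facilitator delivered the content
    participant_responsiveness: int  # extent of participant engagement
    program_differentiation: int     # presence of the program's unique components
    structure_process: int           # delivery framework, roles, and behaviors

    def mean_score(self) -> float:
        # Average across the six domains; a real protocol might weight or report them separately.
        values = [getattr(self, f.name) for f in fields(self)]
        return sum(values) / len(values)

# Example observer ratings for one session (values invented for illustration).
session = FidelityRecord(adherence=5, dosage=4, quality_of_delivery=4,
                         participant_responsiveness=3, program_differentiation=5,
                         structure_process=4)
print(f"Mean fidelity rating: {session.mean_score():.1f} of 5")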

 

Brownson et al,10 Mihalic,11 Fagan et al,12 Mihalic et al,13 Hasson,14 and Noonan et al15 identify several key influencers of implementation fidelity. These include preplanning and motivations for adopting a program; program characteristics such as complexity and structure; training and technical support, including training of the individuals who deliver the interventions; integration of the program within the organization; organizational characteristics such as a positive organizational climate, agency stability, and active leadership; and implementer characteristics such as the motivation, support, and buy-in of staff implementing the program. These factors contribute to the fidelity of the evidence-based intervention and its success or failure in practice.

 

Meeting these requirements is not easy. The aforementioned list represents an ideal situation in which staff, resources, organizational climate, mission, and leadership are all aligned to support intervention success. In practice, this alignment may be difficult to achieve. Numerous barriers to fidelity exist, including individual variations in practitioner adherence and competence, lack of available training and technical support, limited resources for supporting the intervention at the site level, local adaptations of interventions, and competing demands for practitioners' time that can diminish their commitment or effectiveness.16

 

Adapting, Monitoring, and Scaling Evidence-Based Interventions

Many experts question whether adaptations can be made to evidence-based interventions without compromising fidelity, but adaptations are sometimes needed when interventions are replicated in diverse populations and settings. O'Connor and colleagues17 contend that some approaches to adapting interventions, for example, changing or translating language, replacing cultural references and images so that participants see people who resemble the target audience, and adding evidence-based content to increase audience appeal, are acceptable. Other adaptations, such as reducing session length or lowering levels of participant engagement, removing topics or key messages and skills, changing the theoretical approach, and using fewer staff than recommended or staff who are inadequately trained, are risky or unacceptable and may undermine program effectiveness. When such changes are made, the result is a loss of fidelity and a failure to produce the intended results or benefits of the intervention as designed. Still, failure to attain expected results does not necessarily mean that no results are produced.18 Practitioners delivering one well-known intervention noted that they had changed the intervention because participants asked for a more rigorous program. The changed intervention lacked fidelity to the original, becoming a different intervention; however, positive outcomes were nevertheless achieved for the newly minted version.

 

Ongoing monitoring of evidence-based interventions is important if delivery organizations are to maintain fidelity over extended periods, helping to ensure that interventions continue to be provided consistently as programs address issues such as staff turnover, retraining needs, and changes to delivery settings over time. Little is known about whether and how fidelity is monitored in actual public health practice. Some public health departments or community-based delivery organizations have developed fidelity monitoring plans, whereas others may require staff to obtain refresher training in delivering the content of the intervention or may conduct routine oversight, for example, using a checklist to ensure that interventions continue to meet requirements for training, curriculum, and content. More often, however, little attention is paid to fidelity monitoring because of barriers such as limited funding and resources and a lack of opportunities for sharing and disseminating information about practical approaches currently used by program implementers and practitioners.19
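
As an illustration of the checklist approach mentioned above, the following Python sketch flags items that fall out of compliance during routine oversight. The checklist items, site responses, and compliance calculation are hypothetical and are not drawn from any specific intervention's monitoring protocol.

# Minimal checklist-style fidelity monitor (items and responses are hypothetical).
CHECKLIST = [
    "Facilitators completed standard training within the required period",
    "Official curriculum and materials used without substitution",
    "All designated sessions offered at the prescribed length",
    "Required participant engagement activities delivered",
]

def review_site(responses):
    """responses maps each checklist item to True (met) or False (not met)."""
    unmet = [item for item in CHECKLIST if not responses.get(item, False)]
    compliance = 1 - len(unmet) / len(CHECKLIST)
    return compliance, unmet

# Example review of one delivery site.
compliance, unmet = review_site({
    "Facilitators completed standard training within the required period": True,
    "Official curriculum and materials used without substitution": True,
    "All designated sessions offered at the prescribed length": False,
    "Required participant engagement activities delivered": True,
})
print(f"Checklist compliance: {compliance:.0%}")
for item in unmet:
    print("Follow-up needed:", item)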

 

Public health care practitioners seeking to expand the reach of programs confront additional fidelity challenges as they scale up evidence-based interventions and must consider and address the practical requirements of infrastructure, staffing, and resources that support expansion. Scaling up is not just an activity to expand the number of persons served; it also depends on the local organizational and social context of programs. Simmons et al20 describe this context as multidimensional, involving a variety of actors, interest groups, and organizations in a larger socioeconomic, political, cultural, and institutional environment. Within this complex environment, public health care practitioners must also consider what adaptations may be needed for different locales and populations without compromising fidelity, and whether and how fidelity and monitoring issues can be addressed as programs are widely disseminated across diverse settings and populations.

 

A Value Proposition for Public Health Practice

Evidence-based interventions bring the promise of individual and population health improvement, and when brought to scale, they hold the potential to increase public health impact, reach, and gains to participants. These interventions are important tools in the public health arsenal, and fidelity is the key to their optimal performance. While achieving and maintaining fidelity can be challenging, implementers may sell themselves short if they overlook this important component of intervention success. Sharing information across public health and other disciplines about practical approaches for implementing evidence-based interventions, including approaches for ensuring fidelity and bringing programs to scale, is a starting point. Sharing information about changes, intentional or unintentional, that result in a loss of fidelity is also needed, including evidence about changes that result in both poor and improved participant outcomes. Given the potential of evidence-based programs to positively affect health and wellness, and the costs of widely disseminating interventions, ensuring fidelity is a small effort with a large impact, and an essential value proposition for evidence-based public health practice and population health improvement.

 

REFERENCES

 

1. Community PROMISE: Peers Reaching Out and Modeling Intervention Strategies for Community-level HIV/AIDS Risk Reduction. Atlanta, GA: Centers for Disease Control and Prevention. http://www.cdc.gov/hiv/topics/prev_prog/rep/packages/promise.htm. Accessed December 20, 2011.

2. Chronic Disease Self-Management Program. Palo Alto, CA: Stanford Patient Education Research Center. http://patienteducation.stanford.edu/programs/cdsmp.html. Accessed December 20, 2011.

3. Task Force on Community Preventive Services. The Guide to Community Preventive Services: What Works to Promote Health? New York, NY: Oxford University Press; 2005.

4. Novick LF, Kelter A. The guide to community preventive services: a public health imperative. Am J Prev Med. 2001;21(4s):13-15.

5. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30:175-201.

6. Anderson LM, Brownson RC, Fullilove MY, et al. Evidence-based public health policy and practice: promises and limits. Am J Prev Med. 2005;28(5s):226-230.

7. Kerner JF. Integrating research, practice, and policy: what we see depends on where we stand. J Public Health Manag Pract. 2008;14(2):193-198.

8. Dane A, Schneider B. Program integrity in primary and early secondary prevention: are implementation effects out of control? Clin Psychol Rev. 1998;18:23-45.

9. Century J, Rudnick M, Freeman C. A framework for measuring fidelity of implementation: a foundation for shared language and accumulation of knowledge. Am J Eval. 2010;31(2):199-218.

10. Brownson RC, Fielding JE, Maylahn CM. Evidence-based public health: a fundamental concept for public health practice. Annu Rev Public Health. 2009;30:175-201.

11. Mihalic S. Implementation fidelity. http://www.colorado.edu/cspv/blueprints/Fidelity.pdf. Accessed December 2, 2011.

12. Fagan AA, Hanson K, Hawkins D, Arthur MW. Bridging science to practice: achieving prevention program implementation fidelity in the community youth development study. Am J Community Psychol. 2008;41:235-249.

13. Mihalic SF, Fagan AA, Argamaso S. Implementing the LifeSkills Training drug prevention program: factors related to implementation fidelity. Implement Sci. 2008;3:5.

14. Hasson H. Systematic evaluation of implementation fidelity of complex interventions in health and social care. Implement Sci. 2010;5:67.

15. Noonan RK, Emshoff JG, Mooss A, Armstrong M, Weinberg J, Ball B. Adoption, adaptation, and fidelity of implementation of sexual violence prevention programs. Health Promot Pract. 2009;10:59S-70S.

16. Dodson EA, Baker EA, Brownson RC. Use of evidence-based interventions in state health departments: a qualitative assessment of barriers and solutions. J Public Health Manag Pract. 2010;16(6):E9-E15.

17. O'Connor C, Small SA, Cooney SM. Program fidelity and adaptation: meeting local needs without compromising program effectiveness. In: What Works, Wisconsin-Research to Practice Series. Issue 4. Madison, WI: The University of Wisconsin-Madison; 2007:1-6.

18. Lara M, Bryant-Stephens T, Damitz M, et al. Balancing "fidelity" and community context in the adaptation of asthma evidence-based interventions in the "real world." Health Promot Pract. 2011;12(suppl 1):63S-72S.

19. Baker EA, Brennan Ramirez LK, Claus JM, Land G. Translating and disseminating research- and practice-based criteria to support evidence-based intervention planning. J Public Health Manag Pract. 2008;14(2):124-130.

20. Simmons R, Fajans P, Ghiron P, eds. Scaling Up Health Service Delivery: From Pilot Innovations to Policies and Programmes. Geneva, Switzerland: World Health Organization; 2007.

 

Keywords: community-based interventions; evidence-based programs; fidelity; health promotion