Keywords: Intervention fidelity, mHealth, scientific rigor, technology, telehealth



  1. Bonar, Jaime Rachelle M. MOT, BA
  2. Wright, Shawna PhD
  3. Yadrich, Donna Macan BS, MPA
  4. Werkowitch, Marilyn BSN
  5. Ridder, Lavonne MSN
  6. Spaulding, Ryan PhD
  7. Smith, Carol E. PhD, RN


The current standard in healthcare research is to maintain scientific fidelity of any intervention being tested. Fidelity is defined as the consistent delivery of interventions that ensures that all participants are provided the same information, guidance, and/or materials. Notably, the methods for ensuring fidelity of intervention delivery must also be consistent. This article describes our Intervention and Technology Delivery Fidelity Checklists used to ensure consistency. These checklists were completed by trained nurse observers who rated the intervention implementation and the technology delivery. Across our clinical trials and pilot studies, the fidelity scores were tabulated and compared. Intervention information and materials were delivered by a variety of devices including telehealth monitors, videophones, and/or iPads. Each of these devices allows audiovisual connections between health professionals in their offices and participants in their homes. Our checklists guide the monitoring of fidelity of technology delivery. Overall checklist ratings across our studies demonstrate consistent intervention implementation and technology delivery approaches. Uniquely, the fidelity checklist verifies the interventionist's correct use of the technology devices to ensure consistent audiovisual delivery. Checklist methods to ensure intervention fidelity and technology delivery are essential research procedures that can be adapted for use by researchers across multiple disciplines.


Article Content

The term fidelity refers to a concept used widely in information science and healthcare research. Intervention fidelity is defined as staying true to the description of the intervention being tested and delivering that information to all research participants in the same manner.1 Advancements in technology have increased nursing, medical, and allied health providers' ability to deliver interventions to patients at home via affordable Internet options. Establishing intervention or treatment fidelity is challenging, especially when the intervention is delivered at a distance using telehealth technology. An important measure of telehealth intervention fidelity is consistent delivery across technology platforms.2,3 If delivery varies or information is inconsistent, participants do not receive the same intervention. With such variation in a research study, the analysis of outcomes becomes impossible.4


The purpose of this article is to (1) describe the basic components and types of intervention and telehealth delivery fidelity procedures, and (2) illustrate how we used observation checklists to ensure interventions were delivered consistently and as planned across all participants in each study.5 Procedures include having a professional, who is not participating in the intervention delivery, observe and rate interventions using checklists. Positive checklist ratings indicate that an intervention was conducted as planned and delivered consistently.


Further, checklist examples and ratings from pilot and other studies are provided to illustrate observed fidelity. Checklists were designed for each study to measure consistent approaches specific to the intervention and technology used in the delivery. The principles and components that guide fidelity monitoring checklists are based on the National Institutes of Health (NIH) Best Practices national framework for intervention fidelity.6 This framework recommends strategies for maintaining and enhancing consistency of intervention delivery in the studies that NIH funds. The NIH Framework principles are (1) consistency in intervention content, information, and delivery; and (2) training the interventionists (ie, the nurse, psychologist, or health professional who implements the intervention) to deliver a consistent intervention. Maintaining these NIH principles through strategies such as observation checklists is essential to good intervention and delivery fidelity. Concrete examples are described across a series of studies with patients who must infuse intravenous (IV) nutritional fluids daily. Given that participants were observed in their homes from a distance via audio and visual devices, fidelity was assessed for each technology delivery.


Fidelity in Interventions Delivered Via Technology

Audiovisual connections allow interventionists to observe and support patients in learning and following through with prescribed home care and health management tasks. Delivery into the home reduces the risk of exposure to infection or contagion associated with waiting rooms and hospital clinics and reduces patient travel. Technology to support patients in their homes is growing exponentially, yet there are few studies that test fidelity of interventions via audiovisual technology.3 Consistency and reliability of the method of delivery should be measured so that each intervention is delivered in the same manner.7-9 Further, consistent delivery allows other researchers to replicate the study.10,11 Thus, checklist procedures for assessing delivery fidelity were used across all our studies.


All intervention sessions began with the interventionist guiding the correct setup and use of the technology placed in participant homes (Table 1). These procedures allowed audiovisual delivery to be consistently understandable with clear visual connections.

Table 1. Checklist Items Regarding Setup for Technical Delivery

Across all our studies, maintaining fidelity included placing the technology device where the two-way communication was clearest by using appropriate lighting and checking for adequate sound.12,13 Effective lighting was essential to clearly interact with participants and observe facial expressions and body language. Unique to the fidelity strategies used in these studies is training for each technology device in each study so that all participants receive consistent delivery. The intervention observation checklist aligns with previous methods developed for observing nursing care given in the home. An inconspicuous observer monitored each technology intervention delivery session and was trained to use the specific checklist rating scales, set up on a Likert (1-5) rating system. Observers rated fidelity for each delivery, verifying, for example, that video calls were conducted in a private location, that the technical device rested on a stable surface, that volume was adequate, and that the picture was clear.


The fidelity checklist criteria assess various technology delivery devices such as videophones, telemedicine equipment, and audiovisual tablets. The checklist rating items were written to apply to the specific technology used in each study.12 Thus, checklist procedures were designed to ensure each technology device functioned at its highest possible level. Participants also evaluated the clarity of the delivery. Table 2 is a summary checklist from participants' anonymous evaluation of the early videophone technology in our studies. Participant ratings used a Likert scale from 1 (strongly disagree) to 5 (strongly agree) regarding technology delivery of interventions. Participants rated it easy to ask the nurse interventionist questions, see and understand what the interventionist was saying, and clearly see the illustration materials projected on the telehealth monitor. The high midrange scores (>3) given by participants concerning their desire to use the videophone to talk with others were encouraging.

Table 2. Participant Checklist Ratings of Videophone Use and Delivery of Information


Specially trained nurse observers used the checklist to rate whether the interventionists followed their training during the rare technical disconnections, managed any screen blurring due to movement, and scheduled subsequent technology sessions.14 Checklists also guided the observer to rate the delivery of the specified intervention information content, topics to discuss, or healthcare skills demonstrated.15 The fidelity checklists also included criteria-specific rating items for correct use of a variety of intervention approaches, such as behavioral skills training, resilience-building strategies, psychologists' adherence to consistent counseling techniques and approaches with patients,16,17 teaching about medications, and home monitoring of prescribed medical treatments. A specific fidelity checklist was generated for each study based on intervention content, materials provided to participants, and the specific technology used.


In all studies described here, the participants were drawn from the population of individuals prescribed lifelong, daily IV nutrient infusions to sustain their health.18 The fidelity research question for each study reported here was as follows: Were the interventions and technology delivery consistent across all participants in that study? Fidelity ratings were also summarized across the studies and reported here to verify that each study had intervention and delivery implementation fidelity (consistency).19


Samples of Technology Devices and Telemedicine Equipment Used Across Studies

These studies used varying technology equipment. All devices were easily mailed or delivered to the participants' homes for ongoing and long-distance telecommunication visits. Across all studies, all participants provided informed consent per institutional review board (IRB) approval. Equipment loan agreement and image disclosure consent forms were obtained prior to scheduling the technology-delivered intervention visits. All long-distance telephone fees and Internet provider fees were covered at no cost to participants. All studies used only university medical center encrypted connections and IRB-approved Internet providers, which enabled firewall protection. It is important to note that the control groups in these studies also had technology-based audiovisual sessions using the same device on the same schedule and for the same length of time as the intervention group. However, the control group visits did not include the interventions being tested. Thus, this group controls for influences of the novel technology visits and the Hawthorne effect of being observed. Each study had a manual describing a specific intervention and approach to be used with that patient population. The research team observers rated interventions during each technology session.


Observation When Using the Fidelity Rating Checklists

The observer was trained to complete the checklist ratings during the delivery of each intervention. The directions at the top of the fidelity checklist stated, "These ratings are used to evaluate the administration of the intervention by the technology used. Checklists are not used to rate the participants' discussion comments or their reactions to the intervention content."


The checklist had a total of six labeled rating sections. Each item in the scale was rated numerically. The summed numbers for each section were averaged, resulting in an overall section rating score. Checklists were unique to each study, related to the specific technology intervention approach, and were used each time an audiovisual session was delivered.
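The scoring arithmetic described above (items rated on a 1-5 Likert scale, summed within a section, then averaged into a section score, with section scores combined into a session rating) can be sketched in a few lines. The section names and ratings below are illustrative placeholders, not the studies' actual checklist wording or data.

```python
def section_score(item_ratings):
    """Average the 1-5 Likert ratings of the items in one checklist section."""
    return sum(item_ratings) / len(item_ratings)

def overall_score(sections):
    """Average the section scores to get one overall rating for the session."""
    scores = [section_score(items) for items in sections.values()]
    return sum(scores) / len(scores)

# Hypothetical ratings from one observed session (names are illustrative).
session = {
    "Technical Competence": [5, 4, 5, 4],
    "Follows Intervention Manual": [4, 5, 4, 5, 4],
    "Assesses Comprehension": [4, 4],
}

print(round(overall_score(session), 2))  # -> 4.3
```

A score near the top of the 1-5 range, as in this sketch, corresponds to the 4 to 5 ratings the article reports across its studies.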


Sections on the Intervention Fidelity Observation Checklist

The first section of each checklist, entitled Interventionist Technical Competence, included questions that rated the interventionist's ability to set up the technology equipment correctly and make the necessary connections prior to and during the intervention session. Each rating scale focused on the consistent audiovisual delivery through each specific technology device. For example, in all the videophone/iPad interventions, technical competence was assessed based on whether the interventionist checked that the (1) videophone/iPad camera was in focus; (2) technology device was placed at the correct distance in front of the participant for visual assessment; (3) location area was private, not public, and suitable for a healthcare visit; and (4) audio for the videophone/iPad visit was clear and at an acceptable volume.


The second section of the checklist, entitled Interventionist Follows Intervention and Approach Manual, assessed whether the interventionist presented the information logically per the standardized manual and engaged participants in the intervention discussion. Facilitating discussion, not lecturing, was emphasized. Specific questions asked whether the interventionist (1) facilitated the participants' discussion, (2) followed each step of the delivery approach, (3) implemented information from the intervention guide, (4) engaged the participant in discussion of the intervention, and (5) elicited a "verbal response" that the participant would use the intervention approach and respond to future emailed reminders. This second section ensured that each participant received the same information and was engaged in discussions of when and how to use the approach in their own health/illness self-management.


The third section, Interventionist Assesses Patient Comprehension of the Information, evaluated whether the interventionist ensured that delivery was clear and understood. For example, for the infection/depression intervention, fidelity items focused on "participant describes the principles of infection symptom monitoring" and the "participants discuss strategies to avoid low moods."


The fourth section was Interventionist Overall Competence in Addressing Untoward Discussion during the audiovisual technology-based visit. Ratings on items for the interventionist included "adequately manages interruptions or participants monopolizing the discussion," "adequately addresses participant questions about the approach," and "acknowledges participants' concerns." This section assessed the interventionist's ability to draw participants back onto the information topic after untoward issues arose.


The fifth section of the checklist, Interventionist Effectively Communicates, assessed the interventionist's use of effective communication techniques such as "uses reflective listening" (eg, listens carefully, then restates what the participant is saying to clarify) and "uses empathic responses" (eg, "I follow you" or "I understand"). Other ratings included whether the interventionist "asks the participant to get comfortable for the next 15 to 60 minutes" and, at the discussion close, "routinely asks the participant if they have any questions."


The sixth and final section, Interventionist Guides Intervention Home Use and Future Technology Session Scheduling, assessed the interventionist's ability to reinforce participants' use of the information, skills, or materials discussed, and to maintain scheduling of the technology-based discussions. Items included whether the interventionist "guides participants to select their preferred time of day for use of the intervention in their daily healthcare" and whether the interventionist demonstrated "ease of rescheduling future technology-based sessions" (eg, the interventionist asked, "What time is good for you? Let us see if we can keep to the weekly schedule").


Items in the sixth section were adapted for the technology used in each study. Likewise, in each study, the total sample (or every patient participating in each study) was observed for the fidelity ratings of the intervention and the technology delivery.
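As a rough illustration of how the six labeled sections described above can serve as a reusable template that is adapted per study, the sketch below holds the section titles (taken from the article) in a list and generates a blank per-session rating sheet from them; the function name and structure are our own, not the studies' instrument.

```python
# Section titles as described in the article; example structure is ours.
FIDELITY_CHECKLIST_SECTIONS = [
    "Interventionist Technical Competence",
    "Interventionist Follows Intervention and Approach Manual",
    "Interventionist Assesses Patient Comprehension of the Information",
    "Interventionist Overall Competence in Addressing Untoward Discussion",
    "Interventionist Effectively Communicates",
    "Interventionist Guides Intervention Home Use and Future Technology "
    "Session Scheduling",
]

def blank_checklist(sections=FIDELITY_CHECKLIST_SECTIONS):
    """Start a per-session rating sheet: one empty item-rating list per section."""
    return {title: [] for title in sections}

sheet = blank_checklist()
print(len(sheet))  # -> 6 rated sections per observed session
```

In the studies, the items within each section (not the sections themselves) were what changed with the technology device and intervention content.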


Training Research Teams to Maintain Fidelity

To ensure fidelity of an intervention, it is crucial that all team members are trained in standard research procedures before administering the intervention to participants. For example, each team member must be able to explain encryption, firewall protection, intervention content, the research procedures, and guidelines for participants, as well as demonstrate knowledge and understanding to address technology questions or problems that may arise during an intervention.20 Fidelity training also ensures that procedures for visual projections of slides and/or handouts are coordinated throughout the interventions. Training includes practice responding to possible technical problems during sessions. Additionally, terms used to guide participants' use of the technology device should be defined during the original intervention so that varied wording does not confuse participants later.21


The interventionists are taught to begin a session by setting "ground rules" for participants about not disclosing personal or health-related information or giving medical advice to others during group discussions. The interventionist must learn communication techniques to guide the discussion back to the intervention topic and to prompt other participants to share if one participant monopolizes the discussion. Duties should be discussed during initial training and reviewed annually to confirm the roles each research team member is to play throughout the intervention. Finally, annual staff training adds to the rigor of the study by ensuring all team members maintain consistent operating knowledge of research regulations and IRB/HIPAA compliance.


Pilot Study Testing Using Fidelity Checklists

The first pilot study included 10 participants who were invited to test analog videophones mailed to their homes in 2010.22 In this study, fidelity checklists also included observations of participants conducting daily IV infusion care. Home care and self-management of IV infusions can be challenging for patients. Few fidelity studies have been conducted to determine the level of detail that interventionists can visually assess using small, plug-in, analog or digital videophone or iPad connections. Thus, these pilot studies tested observation fidelity and whether nurses could clearly see the details of home IV infusions via technology.


Equipment chosen for this study included a one-piece flip-top analog videophone weighing 1.5 lb with a 4-inch color thin-film-transistor liquid crystal display active matrix screen, a high-resolution (325 K pixels) color charge-coupled device camera, and an embedded internal speaker. This device allowed two-way video so that the interventionist and participant could see each other simultaneously. The internal 33.6-kilobit-per-second modem transmitted the telehealth audio and video signals via the Public Switched Telephone Network at a rate of approximately 18 frames per second. This rate allowed visual exchanges that looked similar to television viewing (approximately 30 frames per second).
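A back-of-the-envelope calculation (ours, not the study's) makes these figures concrete: dividing the 33.6-kbps modem bandwidth by the roughly 18 frames transmitted per second gives the bit budget available per video frame, which shows how heavily the analog videophone signal had to be compressed to approach television-like viewing.

```python
# Quoted hardware figures from the pilot study equipment description.
MODEM_BPS = 33_600       # 33.6-kilobit-per-second modem
FRAMES_PER_SECOND = 18   # approximate transmitted frame rate

# Upper bound per frame, ignoring the audio share and protocol overhead,
# so the real video budget per frame was even smaller.
bits_per_frame = MODEM_BPS / FRAMES_PER_SECOND
print(f"{bits_per_frame:.0f} bits (~{bits_per_frame / 8:.0f} bytes) per frame")
```

Roughly 1.9 kilobits per frame is a tiny budget for a 325 K-pixel camera, which helps explain the pilot finding that fine visual detail (eg, redness at an IV site) was not always discernible.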


The purpose of this pilot study was to determine whether these compact, analog videophones could observe details of IV infusion care procedures conducted by patients at home and to evaluate patient and interventionist satisfaction with the telehealth intervention.


The videophone was placed 8 to 12 inches away from the participants during their daily IV infusions. The nurse interventionist noted any procedural concerns observed and then made suggestions for improvement and for maintaining aseptic technique. The fidelity ratings revealed that the interventionist consistently guided the procedures. Notably, these ratings also confirmed that nurses clearly observed participants performing their IV cleansing procedures. For example, the interventionist was able to clearly evaluate participants cleaning the skin around the IV infusion site, and 100% of participants correctly covered their IV site without touching it, thus maintaining sterile bandaging. Furthermore, for 90% of participants, the interventionist was able to clearly observe participants cleansing their catheter tubing hub connection with antiseptic solution.


However, the ratings also found that there was inconsistency in nurses' ability to observe any appearance of infection around the participant's IV site. Even after instructing participants to move the camera as close as possible to their IV site and directing the lighting, there was not enough visual clarity in all cases to determine presence or absence of infection. Thus, the nurse needed to ask each participant about any redness, inflammation, or swelling at the IV site and if there were any symptoms of discomfort or fever. From this fidelity check, we learned that when video technology is used to assess or support patients, the interventionists should be prepared to ask the patient to describe their experiences and/or symptoms, as well as to conduct a visual assessment. The outcomes of this fidelity testing indicated that technology allowed adequate assessment of some but not all of the details of patient IV home care (Table 3). Importantly for clinical intervention fidelity, improved cameras can zoom in for better inspection.

Table 3. Nurse Interventionists' Ability to Clearly Observe Participants' IV Site and Families' IV Infusion Care While Using Audiovisual Technology

Clinical Trials Fidelity Testing

Our first clinical trial using telemedicine equipment included 30 participants. The telehealth units used in this study were connected through residential telephone lines. These telehealth units had small, built-in, two-way cameras that allowed interventionist and patient to see each other. The unit weighed 2.75 lb and was easily installed. The in-home modem transmitted the audio and video signals via single telephone line at 15 frames per second. This speed created a 2- to 3-second delay between speech and reception, which was described as minimal to the participants. This equipment was selected for its technical reliability, portability, and the low cost per unit.23


This clinical trial tested participants' adherence to in-home breathing enhancement treatments.24 The adherence outcome was measured by each participant's breathing assist ventilator timer-recorder. After the telehealth interventions, a higher percentage of intervention participants were adhering to the time prescribed for using their breathing machine. The participants and interventionists completed a technology fidelity survey to collect opinions of the telehealth transmissions between each participant and the nurse interventionist.25


Another clinical trial, this one using iPad technology (Apple, Cupertino, CA), included 126 participants who attended group audiovisual sessions conducted from 2013 through 2016.12 A team of three clinical experts participated in each intervention session from their offices with multiple participants (ranging from two to six participants) from home.14 The iPad technology used in this trial was the 16-GB iPad Mini with Retina Display. Each iPad had a data plan allowing access to our encrypted firewall-protected university medical center file server. Multiple participants and professionals could see one another in separate windows on the iPad screen. This study yielded a cost analysis of implementation for these iPad sessions.26


An additional clinical trial of another nursing intervention began in 2016 and is near completion.27 This trial also uses iPad technology. As in the pilot studies, each clinical trial intervention had standardized information materials consistently provided to the participants.


Fidelity Checklists Data Analyses Summary

The Fidelity Rating Scale scores for each of these studies were calculated for each section, and all section scores were totaled to get an overall rating. Specifically, the scores from each intervention session were summed into an overall total score, which was later used in the statistical regression analyses to determine how much information each patient was exposed to. Thus, fidelity rating scores were used to control for type III error (when a lack of outcome effect occurs because an insufficient amount of the intervention was conveyed).6,28 Intervention fidelity scores were also used to calculate the amount of time taken with each participant to control for type II error.29 Researchers used the ratings data from the Fidelity Checklist in each study to discuss consistent delivery of information and best practices for technology delivery throughout each study.



Across all our technology-delivered intervention studies, the overall scores for Interventionists' Technical Competence Ratings (ie, videophone/iPad placement, adequate lighting and volume, session scheduling choice, troubleshooting) ranged from 4 to 4.5 on a 5-point scale. In the iPad studies, technology fidelity was further assessed by an anonymous survey of participants about their use of the technology. A majority completed the survey, and the overall scores were positive regarding technology use.12


The rating scores across studies on following the intervention manual, checking participant intervention comprehension and effective communication, and scheduling were also between 4 and 5. These scores indicate the planned interventions were conducted consistent with the research manuals. Further, the team members consistently delivered the interventions, assessed participants' comprehension, effectively facilitated discussions, communicated well, and guided home intervention use.


Additionally, the data showed consistent fidelity observation ratings across all the studies reviewed, regardless of the technology used. Specifically, the data supported the conclusions that (1) there was proper use of the videophones/iPads; (2) interventionists demonstrated competence in addressing untoward events such as technical disconnections during the visit; (3) interventionists presented consistent information by following the standardized information scripts and used discussion strategies to engage families; (4) assessment of the participants' comprehension of the information was validated; and (5) reinforcement of participants' ease and use of the information content each day in their home management was affirmed. Furthermore, our studies systematically evaluated the use of the videophones to visually observe patients preparing and completing their daily complex home IV infusions.



Observation data across studies confirmed that training interventionists to deliver consistent healthcare information and technology interaction was successful. One essential aspect of maintaining fidelity is writing a script that guides the intervention content order and how to incorporate graphics or handouts. The script is not a word-for-word document but rather topical, with bolded headings as reminders of the topics to be discussed and the questions to pose during delivery to engage the participants in discussion and encourage sharing.30 A written script is neither to be read nor presented like a lecture, but used as a guide for discussing specific content that participants are to understand and use in their daily home care.31 The script should be practiced aloud for clarity and ease of delivery. The script flow can then be adjusted, refining the timing and rearranging topics as needed, so that content builds logically (ie, on protection of health through infection prevention) and flows well.


Techniques to Ensure Fidelity of Intervention and Technical Delivery of Healthcare

It is important to physically set up the technology the interventionist uses so the script can be easily followed and eye contact with participants maintained during the sessions. Good eye contact aids interventionists in developing rapport and engaging participants. Our psychologist interventionist used dual computer screens during each audiovisual session, with the session script displayed on the desktop computer screen and a second larger screen showing participants in thumbnail images. A dual-lens live-video camera hanging from the ceiling captured a thumbnail image of the interventionist with all other participants on each iPad screen. The system then projected the script and the live-video thumbnails from each participant onto the wall-mounted 55-inch TV screen, allowing the displayed session script to be slightly enlarged and aiding in simultaneous visualization of the script and each participant (see Figure 1).

Figure 1. Elements of successful electronic screening integration.

The interventionist sat at a desk approximately 12 feet from the wall-mounted screen. The high-definition camera, mounted to the ceiling on an extension pole between the interventionist and the screen, minimized view obstructions and allowed the interventionist to easily read and scroll through the script while maintaining eye contact and visual engagement with participants (see Figure 1). The two white, sheet-like cloths hanging in the photograph were arranged for lighting so that clarity was achieved in video capture. All participants were able to see all other participants and the interventionist during each session.


A multidisciplinary team approach was developed to deliver the intervention while engaging participants in discussions and monitoring participant response. In one study, three professionals (a nurse, physician, and psychologist) were involved in leading the intervention discussion.18 In another study, our psychologist led the delivery of the scripted intervention, while another interventionist monitored participants for discussion engagement. This team member clarified participant responses to address any concerns, maintained pace of the session for the time allotted, and asked supportive questions based on observations of participant reactions (ie, discomfort, emotional reaction, fatigue, and possible illness). This approach was essential to maintain engagement and participant "uptake" of the information conveyed in the intervention.


Finally, our researchers acknowledge that the high fidelity ratings came from team preparation. Technical specialists, although most often unseen by participants, were essential in maintaining fidelity in technology delivery. These technical staff experts worked to ensure clarity of slide presentations and resources to display across the technology delivery. The intervention scripts promoted timely delivery of visual aids by technical staff throughout intervention sessions.


While analyzing the ratings data during each study, the researchers found that participant engagement in discussion was facilitated by each interventionist in a number of ways. One method included pausing to give participants time to contribute. Interventionists used feedback from the participants to adjust future information sessions to meet the needs of the group. Importantly, the interventionists developed their ability to encourage the participants' home healthcare activities. This technology delivery has been highly rated by participants as beneficial for reaching out and following up about their care without travel and waiting room time.


Implications for Future Research

Future research should be carried out using Intervention Fidelity Checklists to determine whether technology-delivered interventions are followed consistently and to ensure that technology delivery also maintains fidelity.32,33 In addition, recent articles have discussed maintaining fidelity to ensure that there is transparency of research procedures so that these can be used in other studies.34 Moreover, there are recommendations for systematically testing fidelity when assessing interventions that are tailored or individualized for patients.1,4 Further studies with the ever-improving technologies used in telehealth and mHealth are needed to establish sustainability of using devices to consistently deliver interventions. Using our specific checklists to observe and then rate the fidelity of intervention information and delivery via technology has ensured consistency, an essential component of rigorous research. These checklists will be shared upon request.



The authors are grateful to the University Medical Center for Telemedicine and to Dedrick Hooper and Jeremy Ko for their technical expertise in establishing each technology session. The authors also thank all who participated in these studies for their time, their use of the technology, and their shared opinions and evaluations of technology-delivered healthcare information. Stark Wright is acknowledged for his photographic skills.




1. Perez D, Van der Stuyft P, Zabala MC, et al. A modified theoretical framework to assess implementation fidelity of adaptive public health interventions. Implementation Science. 2016;11(1): 91. [Context Link]


2. Roen K, Arai L, Roberts H, Popay J. Extending systematic reviews to include evidence on implementation: methodological work on a review of community-based initiatives to prevent injuries. Social Science & Medicine. 2006;63(4): 1060-1071. [Context Link]


3. Bosak KA, Pozehl B, Yates B, et al. Challenges of applying a comprehensive model of intervention fidelity. Western Journal of Nursing Research. 2012;34(4): 504-519. [Context Link]


4. Calthorpe RJ, Smith S, Gathercole K, Smyth AR. Using digital technology for home monitoring, adherence and self-management in cystic fibrosis: a state-of-the-art review. Thorax. 2019;75: 72-75. [Context Link]


5. Murphy SL, Gutman SA. Intervention fidelity: a necessary aspect of intervention effectiveness studies. American Journal of Occupational Therapy. 2012;66: 387-388. [Context Link]


6. Bellg A, Resnick B, Minicucci DS, et al. Enhancing treatment fidelity in health behavior change studies: best practices and recommendations from the NIH Behavior Change Consortium. Health Psychology. 2004;23(5): 443-451. [Context Link]


7. Santacroce SJ, Maccarelli LM, Grey M. Intervention fidelity. Nursing Research. 2004;53(1): 63-66. [Context Link]


8. Ross VM, Smith CE. Longitudinal trends in quality of life after starting home parenteral nutrition: a randomized controlled study of telemedicine. Clinical Nutrition. 2008;27(2): 314-314; author reply 316. [Context Link]


9. Piamjariyakul U, Smith CE. Telemedicine utilization reports and evaluation. In: Montoney L, Gomez C, eds. Telemedicine in the 21st Century. Hauppauge, NY: NOVA Science Publishers; 2008. [Context Link]


10. Lee CYS, August GJ, Realmuto GM, Horowitz JL, Bloomquist ML, Klimes-Dougan B. Fidelity at a distance: assessing implementation fidelity of the early risers prevention program in a going-to-scale intervention trial. Prevention Science. 2008;9(3): 215-229. [Context Link]


11. Munro CL, Savel RH. Rigor and reproducibility in critical care research. American Journal of Critical Care. 2017;26(4): 265-267. [Context Link]


12. Smith CE, Werkowitch M, Yadrich DM, Thompson N, Nelson EL. Identification of depressive signs in patients and their family members during iPad-based audiovisual sessions. CIN: Computers, Informatics, Nursing. 2017;35(7): 352-357. [Context Link]


13. Smith CE, Leenerts MH, Gajewski BJ. A systematically tested intervention for managing reactive depression. Nursing Research. 2003;52(6): 401-409. [Context Link]


14. Smith CE, Spaulding R, Piamjariyakul U, et al. mHealth clinic appointment PC tablet: implementation, challenges and solutions. Journal of Mobile Technology in Medicine. 2015;4(2): 21-32. [Context Link]


15. McGrew JH, Griss ME. Concurrent and predictive validity of two scales to assess the fidelity of implementation of supported employment. Psychiatric Rehabilitation Journal. 2005;29(1): 41-47. [Context Link]


16. Smith CE, Cha JJ, Kleinbeck SV, Clements FA, Cook D, Koehler J. Feasibility of in-home telehealth for conducting nursing research. Clinical Nursing Research. 2002;11(2): 220-233. [Context Link]


17. Nuro K, Maccarelli L, Baker SM, Martino S, Rounsaville BJ, Carroll KM. Yale Adherence and Competence Scale (YACSII) Guidelines. Vol. 161. West Haven, CT: Yale University Psychotherapy Development Center; 2005. [Context Link]


18. Smith CE, Piamjariyakul U, Werkowitch M, et al. A clinical trial of translation of evidence based interventions to mobile tablets and illness specific Internet sites. International Journal of Sensor Networks and Data Communications. 2016;5(1-7). [Context Link]


19. Carroll C, Patterson M, Wood S, et al. A conceptual framework for implementation fidelity. Implementation Science. 2007;2: 40. [Context Link]


20. Yadrich DM, Fitzgerald SA, Werkowitch M, Smith CE. Creating patient and family education Web sites: assuring accessibility and usability standards. CIN: Computers, Informatics, Nursing. 2012;30(1): 46-54. PMCID: PMC3673715. [Context Link]


21. Fitzgerald SA, Yadrich D, Werkowitch M, Piamjariyakul U, Smith CE. Creating patient and family education Web sites: design and content of the home parenteral nutrition family caregivers Web site. CIN: Computers, Informatics, Nursing. 2011;30(1): 46-54. [Context Link]


22. Spaulding R, Smith CE, Piamjariyakul U, Fitzgerald S, Yadrich D, Prinyarux C. Configuring mHealth devices for secure patient home care and research. Featured interview presented at: International mHealth Scientific Conference; 2013; Washington, DC. [Context Link]


23. Smith E, Cha J, Puno F, MaGee J, Bingham J, Van Gorp M. Quality assurance processes for designing patient education Web sites. CIN: Computers, Informatics, Nursing. 2002;20(5): 503-512. [Context Link]


24. Smith CE, Dauz E, Clements F, Werkowitch M, Whitman R. Patient education combined in a music and habit-forming intervention for adherence to continuous positive airway (CPAP) prescribed for sleep apnea. Patient Education and Counseling. 2009;74(2): 184-190. [Context Link]


25. Smith CE, Dauz ER, Clements F, et al. Telehealth services to improve nonadherence: a placebo-controlled study. Telemedicine Journal and E-Health. 2006;12(3): 289-296. [Context Link]


26. Kim H, Spaulding R, Werkowitch M, et al. Costs of multidisciplinary parenteral nutrition care provided at a distance via mobile tablets. Journal of Parenteral and Enteral Nutrition. 2014;38(2_suppl): 50S-57S. [Context Link]


27. Nelson EL, Yadrich DM, Thompson N, et al. Telemedicine support groups for home parenteral nutrition users. Nutrition in Clinical Practice. 2017;32(6): 789-798. [Context Link]


28. Sidani S. Measuring the intervention in effectiveness research. Western Journal of Nursing Research. 1998;20(5): 621-635. [Context Link]


29. Lipsey MW. Design Sensitivity: Statistical Power for Experimental Research. Vol. 19. Newbury Park, CA: Sage; 1990. [Context Link]


30. Carroll KM, Nich C, Sifry RL, et al. A general system for evaluating therapist adherence and competence in psychotherapy research in the addictions. Drug and Alcohol Dependence. 2000;57(3): 225-238. [Context Link]


31. Dauz E, Moore J, Smith CE, Puno F, Schagg H. Installing computers in older adults' homes for patient education Website: a systematic approach. CIN: Computers, Informatics, Nursing. 2004;22(5): 1-7. [Context Link]


32. O'Brien RA. Translating a research intervention into community practice: the Nurse-Family Partnership. The Journal of Primary Prevention. 2005;26(3): 241-257. [Context Link]


33. Dumas JE, Lynch AM, Laughlin JE, Smith EP, Prinz RJ. Promoting intervention fidelity. Conceptual issues, methods, and preliminary results from the EARLY ALLIANCE prevention trial. American Journal of Preventive Medicine. 2001;20(1): 38-47. [Context Link]


34. French CT, Diekemper RL, Irwin RS, et al. Assessment of intervention fidelity and recommendations for researchers conducting studies on the diagnosis and treatment of chronic cough in the adult: CHEST guideline and expert panel report. Chest. 2015;148(1): 32-54. [Context Link]