Authors

  1. Hensel, Desiree PhD, RN, PCNS-BC, CNE, CHSE
  2. Cifrino, Sheryl PhD, DNP, RN, CHSE
  3. Conover, Katherine-Marie PhD, RN

Nursing programs experienced a profound paradigm shift when health care facilities across the nation discontinued student clinical experiences because of COVID-19 (coronavirus disease 2019). Once our private, liberal arts college, located in the New England region, made the difficult decision to close the campus, spring break was extended to give faculty 1 week to migrate all coursework online. Like other nursing programs, we were challenged to rapidly identify alternative methods to help students meet clinical competencies necessary for graduation and ultimately entry into the workforce. Several professional organizations and private vendors provided free resources to help instructors move clinical practice to a virtual format. Early options we considered included piecing together multiple short learning activities from different sources, having students independently complete case studies with submission for grading, and using vendor-created online simulations.

 

While all options included active learning, these methods seemed fragmented and lacked the best educational practices of collaboration and faculty-student interaction. Our nursing program decided to take a cohesive approach to virtual clinical education and use this opportunity to focus on teaching clinical judgment consistent with the National Council of State Boards of Nursing (NCSBN) clinical judgment assessment model.1 The NCSBN defines clinical judgment as an observable outcome of engagement in thinking and decision making.1 Because simulation and unfolding cases are approaches advocated in the literature for teaching clinical judgment,2,3 we decided to use the National League for Nursing (NLN) Advancing Care Excellence (ACE) for vulnerable populations unfolding cases as the foundation for a standardized approach to virtual clinical education.4,5

 

Creating the Online Modules

The ACE unfolding case studies are part of a continuing initiative by the NLN to improve quality of care for vulnerable populations, including people with Alzheimer's disease, pediatric patients, veterans, people with disabilities, seniors, and caregivers.4 The ACE framework integrates the essential knowledge domains of individualized care, complexity of care, and vulnerability during transitions with the essential nursing actions of assessing function and expectations, coordinating and managing care, using evidence-based knowledge, and making situational decisions to improve care outcomes.5 The free resources for each unfolding case include a patient monologue, 3 simulation scenarios, patient charts, a "finish the story" reflective assignment, and faculty teaching toolkits.4

 

Based on the emerging evidence that 1 hour of simulation can replace 2 hours of traditional clinical education,6 we selected 9 unfolding cases to build into a 6-hour synchronous format meant to replace a 12-hour clinical day. Over the course of 3 days, we filmed 3 videos for each of the 9 cases, using mostly high-fidelity manikins as patients and a small group of faculty as nurses and family members. Video segments ranged from 10 to 27 minutes. Each video was filmed in a single take, mistakes included. The clinical day modules were built in the college's learning management system (LMS) with prework; the patient monologues; the 3 videos; the Situation, Background, Assessment, Recommendation (SBAR) reports; patient charts; a reflective assignment; and a debriefing evaluation survey. Each module also included a folder of faculty resources that was hidden from student view. Clinical coordinators selected the cases to be used for sophomore, junior, and senior students based on clinical objectives.

 

The clinical days were designed to align with the best practices from the International Nursing Association for Clinical Simulation and Learning.7 The clinical day began with groups joining via the LMS collaborate function. The prebriefing started by listening to the monologue followed by a discussion about the client's needs. The groups then reviewed the SBAR report, watched the video, and debriefed the simulation. The process was repeated for the second and third simulations. Breaks and authentic activities such as writing an SBAR report, documenting, and looking up a procedure prep were built into the day. The clinical experience concluded with reflective journaling, next steps discussions, and completion of an online student evaluation of the simulation experience.

 

Faculty Training

Faculty attended a 4-hour training session on the days they were originally scheduled to teach in the clinical setting. One of those sessions was recorded so faculty could revisit the content as needed. Faculty were given an outline of how the day should progress. We emphasized that faculty would be teaching thinking skills and that, even though this was not a traditional clinical day, instructors should hold high expectations that students would have a high-quality learning experience. The training involved 4 parts: teaching clinical judgment, using the LMS collaborate function, facilitating the ACE cases, and conducting a debriefing.

 

Because using a structured debriefing helps improve learning,8 the faculty training included how to use the Gather, Analyze, and Summarize debriefing approach. This simple method is based on listening to learners during the gather phase; facilitating guided reflection on the experience during the analysis phase; and, during the summarize phase, helping learners form conclusions by examining their experience, relating it to the clinical situation, and identifying approaches for future clinical actions.8 The training included how to infuse the 6 steps of the NCSBN clinical judgment assessment model into the analysis component of the debriefing.1 Faculty reviewed a list of suggested prompts to facilitate thinking and make thought processes visible as students worked through recognizing and analyzing cues, prioritizing hypotheses, generating solutions, taking action, and reflecting on how actions affected patient outcomes.3 The list was also placed in the faculty resource section of each module.

 

Outcomes

This initiative by the School of Nursing allowed clinical instruction to continue on a virtual platform. Over 4 training sessions, a total of 14 full-time and 36 part-time faculty learned how to provide alternative clinical education in a standardized manner. During the next 7 weeks, there were 1413 student encounters, calculated as the number of students in each clinical group (6 to 10) multiplied by the total number of clinical offerings. Informal school-wide meetings revealed initial reluctance from both faculty and students about online clinical education. In subsequent meetings, both groups reflected that the experience was better than anticipated. Faculty mentors intermittently joined sessions and found students engaged in robust discussion. Some faculty shared that the approach helped them understand students' thinking processes. Faculty feedback indicated 2 major areas for improvement: a significant amount of overlapping content in some of the selected cases, and the difficulty of keeping students engaged for an entire 6-hour day.

 

We collected quantitative evaluation data after each clinical day from all participants using the Debriefing Assessment for Simulation in Healthcare (DASH) tool.9 This reliable and valid tool, based on how people learn and change in experiential situations, evaluates the instructor's approaches during debriefings.10 The items are rated from 1 (extremely ineffective/detrimental) to 7 (extremely effective/outstanding). The first element assesses the beginning of the simulation-based exercise and how well the instructor orients learners, explains the process, and defines the learning that will take place. Elements 2 through 6 assess the debriefing process, including behaviors related to maintaining an engaging learning environment, organizing the debriefing, facilitating reflection and feedback among learners about their thoughts and associated actions, and supporting transfer of learning to future clinical situations.

 

Sample data from 106 participants in the NLN ACE Judy Morales and Lucy Gray caregiver unfolding case showed an overall DASH mean scale score of 6.57 ± 0.87. Responses to 2 items indicating that the simulations helped develop clinical judgment were as follows: "In the middle, the instructor helped me analyze actions and thought processes as we reviewed the case" (mean, 6.6 ± 0.84), and "At the end of the debriefing, there was a summary phase where the instructor helped tie observations together and relate the case(s) to ways I can improve my future clinical practice" (mean, 6.62 ± 0.84). Evaluation data showed similar results for the other 8 cases.

 

Conclusion

Professional simulation organizations endorse the liberal use of virtual simulation to meet learning objectives during the pandemic.11 Our plan, which can be easily replicated in other programs, helped students gain clinical judgment skills and a better understanding of the complexities of vulnerable populations. After traditional clinical education resumes, faculty plan to continue using the ACE cases and are developing a more formalized system for integrating them into the curriculum. While the ACE cases are excellent teaching tools, programs can also create new unfolding cases to teach curricular exemplars. This teaching method can likewise be replicated without video, using scripts and audio clips to tell the story. Moving forward, we will seek the right balance of traditional clinical education, face-to-face simulation, and virtual clinical education for our nursing program.

 

References

 

1. National Council of State Boards of Nursing. Next Generation NCLEX news, winter 2019. Available at https://ncsbn.org/13342.htm. Accessed June 5, 2020.

 

2. Dickison P, Haerling KA, Lasater K. Integrating the National Council of State Boards of Nursing clinical judgment model into nursing educational frameworks. J Nurs Educ. 2019;58(2):72-78.

 

3. Hensel D, Billings DM. Strategies to teach the National Council of State Boards of Nursing clinical judgment model. Nurse Educ. 2020;45(2):128-132.

 

4. National League for Nursing. Advancing Care Excellence series. 2020. Available at http://www.nln.org/professional-development-programs/advancing-care-excellence-s. Accessed June 5, 2020.

 

5. Tagliareni EM. Teaching With ACE.S: A Faculty Guide. National League for Nursing; 2016.

 

6. Sullivan N, Swoboda SM, Breymier T, et al. Emerging evidence toward a 2:1 clinical to simulation ratio: a study comparing the traditional clinical and simulation settings. Clin Simul Nurs. 2019;30(C):34-41.

 

7. INACSL Standards Committee. INACSL Standards of Best Practice: SimulationSM. Available at https://www.inacsl.org/inacsl-standards-of-best-practice-simulation. Accessed June 5, 2020.

 

8. Lee J, Lee H, Kim S, et al. Debriefing methods and learning outcomes in simulation nursing education: a systematic review and meta-analysis. Nurse Educ Today. 2020;87:104345.

 

9. Center for Medical Simulation. Debriefing Assessment for Simulation in Healthcare (DASH) student version(C). 2010. Available at https://harvardmedsim.org/wp-content/uploads/2017/01/DASH.SV.Short.2010.Final.pd. Accessed June 5, 2020.

 

10. Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc. 2012;7(5):288-294.

 

11. INACSL and Society for Simulation in Healthcare. Position statement on use of virtual simulation during the pandemic. March 31, 2020. Available at https://www.ssih.org/Portals/48/2020/INACSL_SSH%20Position%20Paper.pdf. Accessed July 1, 2020.