Keywords

clinical judgment, experiential learning, nursing education, nursing students, simulation, simulation design

 

Authors

  1. Chmil, Joyce Victor PhD, RN-BC, CHSE
  2. Turk, Melanie PhD, RN
  3. Adamson, Katie PhD, RN
  4. Larew, Charles PhD, RN

Abstract

Simulation design should be theory based and its effect on outcomes evaluated. This study (1) applied a model of experiential learning to design a simulation experience, (2) examined how this design affected clinical nursing judgment development, and (3) described the relationship between clinical nursing judgment development and student performance when using the experiential learning design. Findings suggest that using an experiential learning simulation design results in more highly developed nursing judgment and competency in simulation performance.

 

Article Content

Simulation-based activities are increasingly being used to replace traditional clinical hours in nursing programs.1 During a simulation activity, prelicensure nursing students engage in an experiential learning process with the goal of stimulating cognitive, metacognitive, psychomotor, and affective domains.2 This experiential learning process incorporates the essential concepts of nursing and the nursing process to facilitate the development of clinical nursing judgment and the provision of competent evidence-based practice.3 Although various theoretical and conceptual frameworks have been used to develop simulation experiences,4 simulation design in nursing is not adequately theory based.5

 

The purposes of this study were to (1) design a simulation experience based on Kolb's Model of Experiential Learning, (2) examine how this theory-based design affects the development of clinical nursing judgment in prelicensure, baccalaureate nursing students, and (3) describe the relationship between clinical nursing judgment development and student performance when the simulation design is fully based on an experiential learning model.

 

Review of the Literature

In his Theory of Experiential Learning, Kolb6,7 presented a model for learning that includes 4 elements, or essential components: abstract conceptualization, active experimentation, concrete experience, and reflective observation. Abstract conceptualization involves knowledge and logic.6 Active experimentation involves the application of knowledge to a situation to plan interventions.6 Concrete experience is the engagement of the learner in activities and observable behaviors, and reflective observation involves self-evaluation that links the expected to the actual outcomes and builds new knowledge.6

 

Much of the research guiding the science of simulation in nursing used a simulation design based on the Nursing Education Simulation Framework3 and focused on evidence that supported either the techniques for implementing the concrete experience of a simulation scenario8-10 or methodologies for facilitating reflective observation and debriefing.11 There was little to no evidence supporting structured abstract conceptualization or active experimentation activities in simulation design in nursing; yet, these are 2 fundamental elements of experiential learning.

 

According to Kolb's theory, experiential learning is dependent not only on the inclusion of all elements of learning but also on the learner's awareness of these elements.6 This awareness of the elements, as well as strategic application of knowledge within the elements, relies on metacognition.10 Metacognition is defined as the conscious awareness of learning.12 It is an essential component of clinical nursing judgment, or one's ability to think in terms of nursing process.13 Clinical judgment can be developed and assessed only when there are observable behaviors allowing for the evaluation of level of mastery in cognitive, psychomotor, and affective domains.11 These behaviors or actions allow for both evaluation by the instructor and self-evaluation by the student.

 

A systematic review of current research supported individual evaluation of the learner for the assessment of judgment and reasoning skills,14 as well as self-evaluation to promote further development of clinical nursing judgment.15 Because the evaluation of clinical nursing judgment development relies on the observation of behaviors, there may be a link between judgment and performance. The National Council of State Boards of Nursing used the term simulation performance to describe the student's abilities to meet set experiential learning outcomes.16

 

Experiential Learning Simulation Design for This Study

The first step in developing the experiential learning design for this study was to build on the traditional simulation design, which has focused on the concrete experience of the clinical scenario and the reflective observations of debriefing. Our aim was to create a new design that incorporated all 4 elements of experiential learning. The Experiential Learning Simulation Design was based on Kolb's Experiential Learning Model6 and 2 premises of Kolb's Experiential Learning Theory.6,7 First, for experiential learning to be most effective, all elements of the learning cycle must be included in the educational experience, and second, the learner must be aware of and actively involved in activities for each of the elements.7 The common practice of using unstructured, independent activities before the simulation experience17 does not adhere to these premises.

 

The use of structured activities for the experiential learning phases of conceptualization and experimentation, in addition to the concrete experience and reflective observation, brings all processes of experiential learning into consciousness for the learner.7 Active participation in, and consciousness of, each phase of the experiential learning cycle provides the learner with a means to link preexperience expectations, concrete simulated experience, and postexperience reflection. The Experiential Learning Simulation Design incorporated each element of experiential learning into a distinctly separate yet interdependent activity (Figure).

 

Methods

Research Design

This quasi-experimental research design used a convenience sample of current students as the experimental group and a historical sample of students who completed the same course during the previous year as the control group. Although both groups completed the same structured scenario and debriefing activities, the control group completed this simulation experience using independent briefing activities, whereas the experimental group completed the simulation experience with structured briefing activities. This quasi-experimental design was chosen for feasibility purposes.

 

Sample

This study was conducted in accordance with the ethical standards of the institutional review board of one author's (J.V.C.) university. The study included a convenience sample of first clinical semester, prelicensure, baccalaureate nursing students (N = 144) from a private university. The experimental group (n = 72) was a sample of students who completed a simulation experience using an experiential learning design as part of the first clinical nursing course of the program. In accordance with the course syllabus, student participation in the simulation experience was mandatory and included a summative evaluation. However, consent for use of deidentified data for research purposes was obtained. Of the 80 students enrolled, 75 met eligibility requirements for inclusion, and 72 consented to participate as the experimental group.

 

The control group (n = 72) was established from a database of previous students who completed the simulation experience using the traditional simulation design. This simulation experience was integrated into the same first clinical nursing course and was conducted at the same point in the program as for the experimental group.

 

The performing and debriefing phases of the simulation experience used the same scenario and debriefing methods, the same raters, and the same faculty and simulation staff for both groups. Each student in both the control and experimental groups completed the simulation experience individually. Any student who had previous clinical experience was ineligible for the study. This sample was chosen for feasibility and for control of extraneous variables such as curriculum design, clinical experience, and faculty for the foundations course, as between-group differences in any of these variables could have confounded results.

 

Variables and Instruments

For the purposes of this study, traditional simulation design refers to a learning experience traditionally used at the institution where this study was conducted and is based on the Nursing Education Simulation Framework.18 This traditional simulation design consisted of activities in which learners independently completed unstructured activities for thinking and planning before a 30-minute activity for performing and a 30- to 60-minute activity for debriefing. The experiential simulation design refers to a learning experience based on Kolb's Experiential Learning Model, which consisted of structured, instructor-facilitated activities. The experience included a 15-minute thinking activity and a 15-minute planning activity for briefing, immediately followed by a 30-minute performing activity and a 30- to 60-minute debriefing activity.

 

Clinical nursing judgment development was measured using the Lasater Clinical Judgment Rubric (LCJR) for both the control and experimental groups during the performing and debriefing phases. The LCJR was also used to guide the debriefing for students in both groups. The LCJR has a reported interrater reliability of 0.889, an intrarater reliability of 0.908, and an internal consistency of 0.974.19 Internal consistency of the LCJR within this study was excellent for the total score (11 items), with a Cronbach's α of .92. The LCJR is designed to measure clinical nursing judgment development over time.20 Because LCJR scores in this study were obtained in the first clinical simulation of the program, they represent the beginning score of the students' trajectory of clinical nursing judgment development.
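Internal consistency statistics like those reported above can be computed directly from item-level scores. A minimal sketch of Cronbach's α, using hypothetical item scores rather than the study's data:

```python
from statistics import variance  # sample variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of items, each a list of per-respondent scores."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # each respondent's total score
    item_variance = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_variance / variance(totals))

# Hypothetical example: 3 items rated by 4 respondents
items = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
alpha = cronbach_alpha(items)  # perfectly consistent items yield alpha = 1.0
```

Higher α indicates that the rubric items move together across respondents; values above .90, as reported for the LCJR and C-SEI here, are conventionally considered excellent.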

 

Simulation performance was defined as the evaluation of behaviors relative to competency, safety, communication, and confidence exhibited during the simulation experience and was measured using the Creighton Simulation Evaluation Instrument (C-SEI). This evaluation was conducted only in the experimental group, which completed the simulation experience using the experiential simulation design. An interrater reliability of 0.84 to 0.89 and an internal consistency (Cronbach's α of .98) have been reported for the C-SEI.19,21 In this study, the C-SEI was highly reliable (22 items; α = .91).

 

In this new simulation design, thinking addressed the element of abstract conceptualization and referred to the phase of simulation design that attempted to stimulate the learner to apply knowledge, thought, and logic to the concepts associated with the simulation scenario. For thinking, the control group used peer-reviewed journal articles as reading assignments. These were posted to the course Web site 1 to 2 weeks before the simulation experience and were completed by the student independently, as was policy at the institution where the research was conducted. No measures were taken to ensure that students completed these assignments beyond noting that each student downloaded the articles from the course Web site. The experimental group used a structured computer-based quiz to actively engage the learner in the process of applying knowledge, thought, and logic to the concepts associated with the simulated scenario (ie, safety, comfort, infection control) immediately before moving to a planning activity. The questions used on this quiz were taken from the test bank for the students' Fundamentals of Nursing textbook.

 

The element of active experimentation was addressed through a planning activity, since planning involves the processes in which knowledge is applied to a clinical situation.22 In this study, planning referred to the phase of simulation design that allowed the learner to apply knowledge of associated concepts to a patient scenario before providing care. In the control group, the simulated patient's electronic medical record was available 1 to 2 weeks before the concrete experience via the course Web site, and students accessed this record independently, as was policy at the institution where the research was conducted. The experimental group used a structured, instructor-facilitated activity: the development of a concept map applicable to the simulation scenario, completed immediately before the performing phase.

 

The performing phase addressed Kolb's element of concrete experience and referred to the phase of simulation design in which the learner engaged in the provision of patient care in a simulated clinical environment. This phase was the same for both the control and experimental groups. The patient scenario used in the performing phase was taken from a simulation scenario textbook and was consistent with the objectives of the nursing fundamentals course.23 Specifically, the objectives were that the student would (1) demonstrate the ability to maintain a safe patient environment at all times, (2) apply knowledge and skills of postoperative care in the simulated setting, and (3) perform a basic head-to-toe assessment. Because communication and physical assessment skills were evaluated, this simulation scenario used standardized patients, humans acting in the patient role using a standardized script to present the patient case.

 

Debriefing addressed Kolb's element of reflective observation. It refers to the phase of simulation design in which the learner engaged in the evaluation of clinical nursing judgment and performance, with an instructor, immediately after the performing phase. The debriefing phase used the LCJR as a guide. All debriefers in this study were trained on use of the LCJR and had 3 to 5 years of experience in using it for evaluation in the simulation setting. In the debriefing phase, the learner linked expected to actual outcomes to self-evaluate and to further develop clinical nursing judgment. This phase was the same for both the control and experimental groups.

 

Data Analysis

All statistical analyses were performed using SPSS 20.0 (IBM, Armonk, New York). There were no significant between-group differences in age, gender, or ethnicity. An independent-samples t test, Pearson product-moment correlation coefficient, and linear regression analysis were used. Significance was set at P < .05.
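As a rough illustration of the between-group comparison described here (not the authors' SPSS procedure), the pooled independent-samples t statistic and Cohen's d can be computed directly from raw scores. All numbers below are hypothetical:

```python
from math import sqrt
from statistics import mean, variance

def pooled_t_and_d(group1, group2):
    """Independent-samples t statistic (pooled variance) and Cohen's d effect size."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = mean(group1), mean(group2)
    # pooled variance weights each group's sample variance by its degrees of freedom
    sp2 = ((n1 - 1) * variance(group1) + (n2 - 1) * variance(group2)) / (n1 + n2 - 2)
    t = (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))
    d = (m1 - m2) / sqrt(sp2)  # Cohen's d: mean difference in pooled-SD units
    return t, d

# Hypothetical scores for two small independent groups
t, d = pooled_t_and_d([4, 5, 6], [1, 2, 3])
# t ≈ 3.674 on n1 + n2 - 2 = 4 degrees of freedom; d = 3.0
```

The degrees of freedom for the t statistic are n1 + n2 - 2, which matches the t142 reported below for two groups of 72.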

 

Results

Clinical Judgment

The mean LCJR score of students engaged in the experiential learning simulation design (mean [SD], 27.81 [4.84]) was significantly higher than the mean LCJR score of students who were engaged in a traditional design (mean [SD], 20.75 [3.96]) (t(142) = -9.57, P < .001). A moderate effect size (0.63) with a power of 0.95 was found using Cohen d.

 

Relationship Between Clinical Judgment and Simulation Performance

Within the experiential learning group, we examined the relationship between clinical nursing judgment development and simulation performance and found this relationship to be positive (r = 0.69) and significant (P < .001). Linear regression analysis of the correlation between the 2 variables further revealed that 47% of the variance in simulation performance (C-SEI score) was associated with clinical nursing judgment development (LCJR score) (R² = 0.47, F(1,70) = 61.38, P < .001; t(70) = 7.84, P < .001).
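In a regression with a single predictor, R² is simply the square of the Pearson correlation, so the reported r = 0.69 and R² = 0.47 are mutually consistent (0.69² ≈ 0.476). A minimal check with hypothetical paired scores:

```python
from math import sqrt
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two paired samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))

# Hypothetical paired scores (e.g., LCJR vs C-SEI totals), nearly linear
x = [20, 22, 25, 28, 30]
y = [60, 64, 70, 75, 79]
r = pearson_r(x, y)        # close to 1 for these nearly linear data
r_squared = r ** 2         # in one-predictor regression, R-squared equals r squared
```

This is why "nearly half of the variance" follows directly from a correlation of 0.69.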

 

Interrater Agreement for LCJR and C-SEI

To assess interrater reliability of the instruments within the study itself, 16 raters would have been required over the course of data collection, as 4 simulations ran simultaneously each day. For feasibility and consistency, interrater agreement for raters using the LCJR and C-SEI was instead assessed before the study in pilot simulations using the same scenario and debriefing. Before data collection, the researcher chose raters who had completed training on the respective instruments. Because 4 participants completed the simulation experience independently but simultaneously, 4 raters were chosen for each of the 2 instruments.

 

The researchers compared the LCJR and C-SEI scores recorded by each rater with the scores assigned by the principal investigator for the same student. Following the Guidelines for Reporting Reliability and Agreement Studies,24 κ scores were analyzed for each rater. Per the benchmarks of Landis and Koch,25 the interrater agreement analysis showed substantial to almost perfect agreement for the set of raters using the LCJR (κ = 0.73 to 0.89) and almost perfect agreement for the set of raters using the C-SEI (κ = 0.81 to 0.84).
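Cohen's κ, the agreement statistic referenced above, corrects raw percent agreement for the agreement expected by chance alone. A minimal sketch for two raters assigning categorical ratings (hypothetical data, not the study's):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning categorical ratings to the same cases."""
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # chance agreement: product of each rater's marginal proportions, summed over categories
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_chance = sum((counts_a[c] / n) * (counts_b[c] / n) for c in counts_a | counts_b)
    return (p_observed - p_chance) / (1 - p_chance)

# Hypothetical pass/fail ratings from two raters on 4 performances
kappa = cohens_kappa([1, 1, 0, 0], [1, 0, 0, 0])
# observed agreement 0.75, chance agreement 0.50, kappa = 0.50
```

On the Landis and Koch scale cited above, κ of 0.61 to 0.80 is substantial and 0.81 to 1.00 is almost perfect agreement.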

 

Discussion

A theoretically based, experiential learning simulation design was used in this study to evaluate its effect on clinical nursing judgment development in prelicensure baccalaureate nursing students. Further analysis was conducted to assess for the relationship between clinical nursing judgment development and performance in the simulated clinical setting when a theoretically based simulation design was used. We found significantly higher levels of clinical nursing judgment development in students who completed a simulation experience using an experiential learning design when compared with students who completed a simulation experience using the traditional design. Among the students who completed a simulation experience using an experiential learning design, there was a significant positive relationship between clinical nursing judgment development and simulation performance, with nearly half of the variance in simulation performance accounted for by clinical nursing judgment development.

 

The findings of this study fill 3 identified gaps in the nursing simulation literature. First, the new design provides a theoretical framework for a simulation experience fully based on an experiential learning model. Second, this study supports the use of a theoretically based simulation design for the development, evaluation, and reporting of clinical nursing judgment development. Third, this study demonstrates a significant relationship between clinical nursing judgment development and performance in the simulation setting.

 

Kolb's Model of Experiential Learning6,7 provided an apt model for a simulation design in which the learner actively engages in various activities to address all elements of experiential learning. In current practice, the element of active experimentation is consistently omitted from simulation design.4

 

The theory-based simulation design used in this study created a simulation experience that engaged students in a concrete experience and reflective observation activity and allowed opportunities for the student to assess knowledge, identify expectations, and plan care through the inclusion of activities that involved abstract conceptualization and active experimentation. This design was constructed for and used as a method of individually evaluating students in the simulation setting, a practice that is not common in the nursing simulation literature but is supported in a meta-analysis of simulation design in healthcare disciplines.14

 

The experiential learning simulation design tested in this study supports current evidence by stimulating cognitive, metacognitive, psychomotor, and affective learning to provide a means for development of clinical competencies, in particular, clinical nursing judgment. This new design actively engaged the learner in various activities, providing a strong theoretically based framework for the execution of a simulation experience that consciously and actively assisted in (1) applying concepts relative to the case scenario to create a plan of care, (2) implementing care in a simulation experience, and (3) self-evaluating to link expected and actual outcomes. These structured activities were shown to more effectively develop clinical nursing judgment than unstructured, independent activities and support the Institute of Medicine initiatives, which recommend educating future nurses in the processes of thinking and decision making.26

 

Limitations

An experimental design with random assignment of students would have been more rigorous; however, the use of historical data to establish the control group did not introduce any identified confounding variables and provided a moderately large sample size. The sample was not completely representative of the population, particularly with regard to age. Also, there was no documentation to ensure that students in the control group completed the independent reading assignments. It is recommended that the study be replicated using an experimental model with a more structured independent activity for the control group. Fiscal and human resources prohibited the assessment of interrater reliability within the study. Interrater agreement for the raters used in this study was established a priori during a pilot for the simulation scenario used in the study.

 

Conclusions

This research study was designed to create a new theory-based experiential learning simulation design and test its effect on clinical nursing judgment development in prelicensure baccalaureate nursing students. This study applied the 4 elements of Kolb's experiential learning theory to create a simulation design that engaged the learner in structured activities within each of the 4 simulation phases: thinking, planning, performing, and debriefing. Findings suggest that engagement of students in an experiential learning simulation design improved clinical nursing judgment among prelicensure baccalaureate nursing students at the beginning of their trajectory of development. Clinical nursing judgment development accounted for nearly half of the variance in simulation performance, as higher LCJR scores were significantly related to higher performance scores. Thus, the use of a simulation design fully based on an experiential learning model better prepared students to perform in the simulation setting than the traditional simulation design did.

 

Future research should investigate the experiential simulation design for its effect on readiness for entry into professional nursing practice. Students should also be evaluated once they are engaged in the actual clinical setting, and those scores be examined to further evaluate the relationships between clinical nursing judgment, simulation performance, and actual clinical performance. Finally, valid and reliable instruments need to be developed to measure each phase of this theory-based design to test the design as a model.

 

References

 

1. Hayden JK, Smiley RA, Alexander M, Kardong-Edgren S, Jeffries PR. The NCSBN national simulation study: a longitudinal, randomized, controlled study replacing clinical hours with simulation in prelicensure nursing education. J Nurs Regul. 2014;5(2):S1-S64.

2. Banning M. Clinical reasoning and its application to nursing: concepts and research studies. Nurse Educ Pract. 2008;8(3):177-183.

3. Jeffries PR. Simulation in Nursing Education: From Conceptualization to Evaluation. 2nd ed. Philadelphia, PA: Lippincott Williams & Wilkins; 2012.

4. Rodgers D. How simulation works: learning theory and simulation. Paper presented at: 13th Annual International Meeting on Simulation in Healthcare (IMSH); January 29, 2013; Orlando, FL.

5. Rourke L, Schmidt M, Garga N. Theory-based research of high-fidelity simulation use in nursing education: a review of the literature. Int J Nurs Educ Scholarsh. 2010;7(1). doi:10.2202/1548-923X.1965.

6. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Upper Saddle River, NJ: Prentice-Hall; 1984.

7. Kolb DA, Rubin IM, McIntyre J, eds. Organizational Psychology: An Experiential Approach. Englewood Cliffs, NJ: Prentice Hall; 1999.

8. Horn M, Carter N. Practical suggestions for implementing simulations. In: Jeffries PR, ed. Simulation in Nursing Education: From Conceptualization to Evaluation. 1st ed. New York, NY: National League for Nursing; 2007:59-72.

9. Berragan L. Simulation: an effective pedagogical approach to nursing. Nurse Educ Today. 2011;31(4):660-663.

10. Bronander K. Modalities of simulation. 2011. Available at: http://www.medicine.nevada.edu/ofd/documents/IPEWorkshopModalities1.pdf. Accessed January 3, 2013.

11. Mariani B, Cantrell MA, Meakim C, Prieto P, Dreifuerst KT. Structured debriefing and students' clinical judgment abilities. Clin Sim Nurs. 2012;9(5):e147-e155.

12. Flavell JH. Metacognition and cognitive monitoring: a new area of cognitive-developmental inquiry. Am Psychol. 1979;34(10):906-911.

13. Tanner CA. Thinking like a nurse: a research-based model of clinical judgment in nursing. J Nurs Educ. 2006;45(6):204-211.

14. Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: systematic review and meta-analysis. Med Teach. 2013;35(1):e867-e898.

15. Lasater K. Clinical judgment: the last frontier for evaluation. Nurse Educ Pract. 2011;11(2):86-92.

16. Li S. The role of simulation in nursing education: a regulatory perspective. 2007. Available at: https://www.ncsbn.org/Suling2.ppt. Accessed January 24, 2015.

17. Chase-Cantarini CS, Scheese C. ODIUM-creative pre and post simulation activities. Paper presented at: 13th Annual International Meeting on Simulation in Healthcare (IMSH); 2013; Orlando, FL.

18. Jeffries PR. A framework for designing, implementing, and evaluating simulations used as teaching strategies in nursing. Nurs Educ Perspect. 2005;26(2):96-103.

19. Adamson KA. Assessing the Reliability of Simulation Evaluation Instruments Used in Nursing Education: A Test of Concept Study [dissertation]. Pullman, WA: Washington State University; 2011.

20. Lasater K. Clinical judgment development: using simulation to create an assessment rubric. J Nurs Educ. 2007;46(11):496-503.

21. Todd M, Manz JA, Hawkins KS, Parsons ME, Hercinger M. The development of a quantitative evaluation tool for simulations in nursing education. Int J Nurs Educ Scholarsh. 2008;5(1):1-17.

22. Banning M. Clinical reasoning and its application to nursing: concepts and research studies. Nurse Educ Pract. 2008;8(3):177-183.

23. Hale TJ, Ahlschlager PM. Simulation Scenarios for Nursing Education. 1st ed. Independence, KY: Cengage Learning; 2010.

24. Kottner J, Audige L, Brorson S, et al. Guidelines for reporting reliability and agreement studies (GRRAS). J Clin Epidemiol. 2010;64(1):96-106.

25. Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics. 1977;33(1):159-174.

26. Institute of Medicine (IOM). The Future of Nursing: Focus on Education [Report Brief]. Washington, DC: National Academies; 2010.