Keywords

NCLEX-RN, NCLEX-PN, Nursing exams, Evaluation, Curriculum evaluation

 

Authors

  1. Kathryn A. Lauchner, PhD, RN
  2. Mary Newman, PhD, RN
  3. Robin B. Britt, EdD, RN

Abstract

This study was designed to determine the accuracy of computerized comprehensive nursing exams, HESI Exit Exams (E2s), in predicting registered nurse and practical nurse students' success on the licensing exam. Schools of nursing that administered E2s during the academic year 1996-97 were surveyed to determine how many students (n = 2809) predicted by the E2 to pass the licensure exam had failed, and whether the exam administration was monitored or proctored. Based on the findings of this study, the E2 was determined to be an accurate predictor of students' success on the licensing exam. However, it was significantly more accurate (P < .05) when the exam was monitored (99.49%) than when it was unmonitored (96.82%). The E2 was determined to be highly predictive of students' success on the licensing exam for all groups tested: associate degree, baccalaureate, diploma, and practical nursing students.

 

Article Content

Nurse licensure exams, the NCLEX-RN and the NCLEX-PN, are high-stakes exams for students, nursing faculty, and college and university administrators. Failure of the exam has financial as well as emotional consequences for students: some nursing employment opportunities are eliminated by such failures, and those who fail may experience diminished self-esteem.1 The licensure exam measures minimum-level competence and is not designed to be a curriculum evaluation tool; nevertheless, faculties often use the school's pass rate on the licensure exam as one measure of curriculum outcomes. College and university administrators are interested in marketing high pass rates as a means to attract students into their nursing programs.

 

Because of the value attributed to success on the licensure exam, it is important for students and faculty to have a means of determining students' preparedness for the licensure exam so that if remediation is indicated, it can be initiated before graduation. Computerized testing can provide immediate feedback regarding students' risk for failing the licensing exam. Further, computerized instruments can serve as learning tools for those preparing for the computerized licensing exams. Billings et al1 and Riner et al2 reviewed computerized NCLEX-RN preparation programs and described commercially prepared computerized instruments based on numerous criteria. Their publications were descriptive and not intended to address the effectiveness of such instruments in predicting students' success or in preparing students for the licensure examination.

 

Many authors have described factors related to NCLEX success.3-7 Some also have studied the relationship of scores on commercially available testing products to students' success on the licensing exam.7-11 The Mosby Assess Test has been found to be a moderate to strong predictor of NCLEX-RN success.8,10,12 However, the Mosby Assess Test is a paper-and-pencil exam, and results are not immediately available. Barkley et al9 studied 81 students from one baccalaureate school of nursing and found a significant relationship between NCLEX-RN success and students' scores on the Psychiatric, Obstetric, Pediatric, and Adult National League for Nursing (NLN) achievement tests. But, again, the NLN achievement tests were paper-and-pencil tests, and results were not immediately available. Ross et al11 examined the usefulness of the Computer Assisted Preparation for the NCLEX-RN (1992) in assessing students at risk of failing the NCLEX-RN. Their findings were based on 230 volunteer participants from one baccalaureate school of nursing, and they concluded that "further research is necessary to establish evidence of the validity of the Computer Assisted Preparation for the NCLEX-RN in predicting NCLEX results."11

 

The need exists to study the predictive accuracy of a computerized exam that can provide immediate feedback. Previous studies often focused on students from one type of educational preparation and from one geographic area. Therefore, to broaden the inferential range, the sample needed to include nursing students from different types of nursing educational preparation and from different parts of the country. This study was designed to measure the effectiveness of a computerized instrument in determining registered nurse (RN: associate degree, baccalaureate, and diploma) and practical nurse (PN) students' preparedness for the licensure exam in the United States. Additionally, the effect of monitoring or proctoring the administration of the test was examined.

 

INSTRUMENT SELECTION

When selecting an instrument for study, several criteria were identified as necessary. First, the researchers wanted to investigate the validity of a computerized exam because such an instrument could provide immediate feedback of students' scores so that remediation could begin immediately if indicated. It was deemed important that the exam be similar to the NCLEX: one that used the same NCLEX-RN or NCLEX-PN test blueprint published by the National Council of State Boards of Nursing13,14 and one with screen layout and keyboard usage similar to those of the NCLEX. Finally, it was important that cumulative reports provide information that might be useful in evaluating nursing curricula.

 

The HESI Exit Exam (E2), developed by Health Education Systems, Inc. (HESI), was selected for this investigation for several reasons. First, it could be used for curriculum outcome evaluation. Second, the test items contained on the E2 were application- and analysis-level questions,15 and the model described by Morrison et al16 for writing critical thinking test items was used to develop test items for the E2. Third, students were able to view rationales for items missed immediately after completing the exam, thus providing a learning experience. Fourth, the E2 score report compared each student's performance in more than 50 different subject categories with that of all students who had previously answered the same test items. Finally, the E2 was selected for study because HESI agreed to provide a listing of all schools that had administered the E2 during the previous academic year (1996-1997), as well as all item analysis data obtained from each administration of the E2.

 

INSTRUMENT DESCRIPTION

An in-depth description of the instrument's development was obtained from HESI promotional materials, computer printouts of item analyses, and information provided by company spokespersons. Development of the E2 was described as an ongoing process. ParSYSTEM, a commercially available test banking and item analysis computer program distributed by Scantron Corporation, was used for item analysis and test item banking. Based on data obtained from the item analysis of each exam administered, test items were revised by a team of nursing education and nursing practice experts.

 

The results of each school group that administered an E2 were analyzed using ParSYSTEM's test analysis program. The reliability coefficient, the Kuder-Richardson Formula 20 (KR20), for each group was calculated within ParSYSTEM and described in the test analysis report. These reports were reviewed and data were tabulated for all administrations of the E2 during the academic year studied, 1996-1997. Fourteen different E2s were administered to 80 groups at 62 schools. The KR20 for these administrations ranged from 0.34 to 0.91, and the average KR20 for the 80 administrations was 0.85. Overall item analysis data, including item difficulty level (the p value, or proportion of students answering the item correctly) and discrimination data (the point biserial correlation coefficient), were calculated for each administration of the E2. These data were accumulated and stored within the item banking program of ParSYSTEM for 7 years, from 1990 to 1997. Additional items were developed throughout this 7-year period, and new test items were piloted with the administration of each E2. Students were unaware of which items were pilot items, and the piloted items did not count toward the students' scores. Data from all uses of all items provided the normative data for the E2.

 

A mathematical formula for predictability was developed by HESI and is considered by the company to be proprietary. This predictability model was applied to each test as a whole, as well as within each subject area tested. The application of this model was implemented by HESI's testing program, SIMCAT. This program permitted comparison of the student taking the test with all students who had previously answered the same test items. Additionally, the SIMCAT program performed certain correlations, which were components of the predictability model and were used to ensure the accuracy of predictions made by the E2.
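
The item statistics named above are standard measures from classical test theory. The following is a minimal sketch, assuming a 0/1 scoring matrix of students by items; the function name and sample data are illustrative assumptions, not HESI's or ParSYSTEM's actual code.

```python
# Classical item analysis: difficulty (p value), point-biserial
# discrimination, and KR-20 reliability for one exam administration.
# Illustrative sketch only; names and data are assumptions.
import numpy as np

def item_statistics(scores: np.ndarray):
    """scores: (students x items) matrix of 0/1 responses."""
    n_students, k = scores.shape
    p = scores.mean(axis=0)        # difficulty: proportion answering correctly
    q = 1.0 - p
    totals = scores.sum(axis=1)    # each student's total score
    # Kuder-Richardson Formula 20 (uses population variance of totals)
    kr20 = (k / (k - 1)) * (1.0 - (p * q).sum() / totals.var())
    # Point-biserial: correlation of each item with the total score
    # (uncorrected: the item itself is included in the total)
    r_pbis = np.array([np.corrcoef(scores[:, i], totals)[0, 1] for i in range(k)])
    return p, r_pbis, kr20

# Random data standing in for one school group's E2 administration
rng = np.random.default_rng(0)
scores = (rng.random((60, 160)) < 0.65).astype(int)   # 60 students, 160 items
p, r_pbis, kr20 = item_statistics(scores)
print(f"KR-20 = {kr20:.2f}")
```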

 

Using the test blueprint of the National Council of State Boards of Nursing,13,14 test items were selected for E2s from item banks developed by HESI and stored within ParSYSTEM. E2s were assembled within ParSYSTEM and were exported to the SIMCAT program, which administered the E2s and applied the predictability model to the data. The exams were available in diskette and network versions and, during the study period, ran under DOS; the format has since been upgraded to the Windows operating system.

 

METHODOLOGY

Eighty administrations of 14 different E2s to students at 54 RN and 8 PN schools of nursing were conducted during the academic year 1996-97. HESI's SIMCAT program was used to administer and score the E2s for both the RN and PN groups; however, the RN and PN exams contained different test items, appropriate for the type of student being tested. Additionally, normative data were established separately for the RN group and the PN group, and these data were stored with the test items in separate RN and PN item banks. Individual student probability scores were calculated using HESI's predictability model, and the student's probability of passing the licensing exam was printed upon completion of the E2. The schools' summary reports reflected cumulative data obtained from scoring individual students' exams.

 

All E2s were secure, computerized exams. At the completion of the exam, the test items were programmed to erase from the diskette, or from the student's file if the network version was used. Students read the rationales for items missed at the completion of the test, and after this review the rationales, too, were programmed to erase. Such measures helped to provide security for the exams.

 

SAMPLE

Students

The student sample comprised individuals enrolled in an RN or PN school of nursing who took the E2, administered by their school, within 1 to 4 months before graduation. The sample consisted of 2613 RN students, which included 1991 associate degree (ADN), 563 baccalaureate (BSN), and 59 diploma students, as well as 196 PN students, for a total of 2809 students.

 

Schools

The school sample comprised those schools that administered the E2 between August 1996 and July 1997: 54 RN programs (35 ADN, 17 BSN, and 2 diploma schools of nursing) and 8 PN programs, for a total of 62 schools.

 

DATA COLLECTION

The predictive accuracy of the E2 was examined by surveying each participating school's response to a two-item questionnaire. Letters explaining the purpose of the study were mailed to the deans/directors of each school that purchased E2s during the 1996-97 academic year. Accompanying each letter was an addressed, postage-paid postcard containing the two-item questionnaire. The first question asked whether the E2 was administered in a monitored situation. Although monitoring was not explained on the questionnaire, it was meant to be interpreted as proctoring the exam administration. Some respondents wrote in who had monitored or proctored the exam, but most simply checked the yes or no response. The second question asked how many of the students predicted by the E2 to pass the NCLEX-RN or NCLEX-PN failed the licensing exam.

 

DATA ANALYSIS

Of the 54 RN schools that comprised the study group, 51 (94.44%) responded to the questionnaire. The responding RN schools represented 2555 students, or 97.78% of the RN students who took the E2 during the study period. These students consisted of 1976 (77.34%) from ADN programs, 520 (20.35%) from BSN programs, and 59 (2.31%) from diploma schools of nursing. The RN respondents were from 23 states across the United States.

 

The PN sample was considerably smaller than the RN sample. Of the 8 PN schools, 7 (87.50%) responded, which represented 170 of the 196 PN students (86.73%) who took the E2 during the study period. Of the responding PN schools, 5 (71.43%) were from the southwest, one (14.29%) was from the northwest, and one (14.29%) was from the north central part of the United States.

 

Of the total 2725 respondents, 1313 (48.18%) were predicted by the E2 to pass the licensing exam without any additional preparation. Of the 2555 RN respondents, 1248 (48.84%) were predicted to pass the NCLEX-RN: 986 (49.90%) ADN, 218 (41.92%) BSN, and 44 (74.58%) diploma students. Of the 170 PN student respondents, 65 (38.24%) were predicted to pass the NCLEX-PN without any additional preparation.

 

Thirty-four of the 1248 RN respondents who were predicted by the E2 to pass the NCLEX-RN failed the licensing exam: 27 from ADN programs, 7 from BSN programs, and none from the two diploma programs. However, 24 of these 34 failures were from the 15 schools that administered the E2 in unmonitored situations. Nineteen of the unmonitored failures were from ADN programs, 5 were from BSN programs, and none were from the two diploma schools. Nine of the 24 unmonitored failures (37.50%) were from one ADN school of nursing. Table 1 describes the number and percent of respondents in each group (ADN, BSN, diploma, and PN), the number and percent in each group predicted by the E2 to pass the licensing exam, and the number of predicted passes in each group that failed the licensing exam.

  
Table 1. HESI Exit Exam Predicted Passes on NCLEX That Failed by Program

Most schools monitored or proctored students while they took the E2. The monitored group consisted of 1971 students (72.33%) and the unmonitored group of 754 students (27.67%). Of the 51 RN schools that responded to the questionnaire, 36 (70.59%) monitored the administration of the E2, and 15 (29.41%) did not. The 36 monitoring RN schools tested 1856 (72.64%) of the RN respondents: 1417 (71.71%) ADN, 388 (74.62%) BSN, and 51 (86.44%) diploma students were monitored while taking the E2. In the PN group, 5 schools (71.43%), which tested 115 students (67.65%), administered the E2 in monitored situations.

 

Only 10 (0.51%) of the 1971 monitored students who were predicted to pass failed the licensing exam, whereas 24 (3.18%) of the 754 unmonitored students who were predicted to pass failed the licensing exam. The 10 failures from monitored administrations of the E2 occurred at 9 RN schools of nursing: 8 failures were from ADN programs, 2 from BSN programs, and none from the two diploma programs. None of the PN respondents who were predicted to pass the NCLEX-PN failed the licensing exam. For all groups of students, the E2 was determined to be 99.49% accurate in predicting success on the licensing exam when administered in monitored situations and 96.82% accurate when administered in unmonitored situations. Table 2 describes the accuracy of E2 predictions in relation to monitoring. Using a chi-square test of significance, the accuracy of E2 predictions was found to be significantly greater (χ²(1) = 4.98, P < .05) for monitored administrations of the exam than for unmonitored administrations (Table 3).

  
Table 2. Accuracy of HESI Exit Exam Predictions in Relation to Monitoring
 
Table 3. Chi-Square Test of the Accuracy of HESI Exit Exam Predictions in Relation to Monitoring
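
The accuracy figures above follow directly from the reported counts, and the monitored-versus-unmonitored comparison is an ordinary chi-square test on a 2 x 2 contingency table. The sketch below, using scipy, is illustrative only: it reproduces the accuracy percentages, but the authors' exact contingency table and any continuity correction are not reported, so the chi-square it prints need not match the published value of 4.98.

```python
# Predictive accuracy and 2x2 chi-square test, from counts in the text.
# Illustrative sketch; not the authors' actual analysis.
from scipy.stats import chi2_contingency

predicted_pass = {"monitored": 1971, "unmonitored": 754}
failed = {"monitored": 10, "unmonitored": 24}

for group, n in predicted_pass.items():
    f = failed[group]
    print(f"{group}: accuracy = {100 * (n - f) / n:.2f}%")   # 99.49% / 96.82%

# Rows: monitoring status; columns: NCLEX outcome (passed, failed)
# among students the E2 predicted to pass.
observed = [[n - failed[g], failed[g]] for g, n in predicted_pass.items()]
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square({dof}) = {chi2:.2f}, P = {p_value:.4f}")
```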

The accuracy of predictions of NCLEX-RN/PN success by type of nursing program was tested using a chi-square test of significance. No significant difference was found (χ²(3) = 2.49, P > .05) in the predictive accuracy of the E2 among the types of programs tested, ie, ADN, BSN, diploma, and PN programs (Table 4).

  
Table 4. Chi-Square Test of the Accuracy of HESI Exit Exam Predictions by Type of Nursing Program
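
The comparison across program types is the same procedure applied to a 4 x 2 table (program type by NCLEX outcome), which has (4 - 1)(2 - 1) = 3 degrees of freedom, hence the chi-square with 3 df reported above. A companion sketch, built from the predicted-pass and failure counts given earlier; again illustrative only, since the authors' exact table is not reported.

```python
# 4x2 chi-square: program type (ADN, BSN, Diploma, PN) by NCLEX outcome.
# Counts from the text; illustrative sketch only.
from scipy.stats import chi2_contingency

counts = {  # program: (predicted to pass, predicted passes that failed)
    "ADN": (986, 27), "BSN": (218, 7), "Diploma": (44, 0), "PN": (65, 0),
}
observed = [[n - f, f] for n, f in counts.values()]
chi2, p_value, dof, _ = chi2_contingency(observed)
print(f"chi-square({dof}) = {chi2:.2f}, P = {p_value:.2f}")   # dof == 3
```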

INTERPRETATION OF FINDINGS

The E2 was found to be highly predictive of students' success on the licensing exam. The accuracy of E2 predictions was so close to the students' actual outcomes on the licensing exam that the probability that these results could have occurred by chance alone was infinitesimal (P < .001). Predictions made by the E2 were consistently high regardless of the type of program tested; there was no significant difference in the accuracy of predictions among the groups tested: ADN, BSN, diploma, and PN programs. With or without monitoring of its administration, the E2 was found to be an accurate predictor of students' success on the licensing exam. However, some schools did not monitor or proctor the administration of the E2 but instead used it as a teaching tool, distributing the E2 diskettes to students to complete at their convenience. In fact, some students were permitted to complete the exams at home and were encouraged to use resources to help them answer the questions. As might be expected, predictions made by the E2 were significantly more accurate when administration of the exam was monitored than when it was not.

 

Monitoring of the E2 did not appear to be important for the RN diploma or PN groups, in that the E2 was 100% accurate in predicting NCLEX success for both monitored and unmonitored administrations of the exam. However, the small size of both groups must be considered when interpreting this finding: the RN diploma group consisted of only 59 (2.31%) of the 2555 RN respondents, and the PN group consisted of 170 respondents, compared with 2555 RN respondents. Additionally, a smaller percentage of PN students (38.24%) than RN students (48.84%) were predicted to pass the licensing exam. The E2 was administered 1 to 4 months before graduation for both the RN and PN groups. In the 1-year PN curriculum, 1 to 4 months before graduation constitutes a far larger percentage of the total curriculum than does the same time period in a 2-year to 4-year RN curriculum. Therefore, fewer PN students were predicted to pass the licensing exam because they had been exposed to proportionately less of their curriculum than had the RN students. Those PN students who were predicted to pass the licensing exam had likely developed a broad knowledge base early in their curriculum that may have contributed to their success on the NCLEX-PN. Consequently, the PN students predicted by the E2 to pass the NCLEX-PN were in less danger of failing the licensing exam than were the RN students.

 

CURRICULUM EVALUATION

Further data regarding curriculum subject categories were provided by the E2 individual student printout of scores, or Score Summary. Additionally, a cumulative summary report describing the school group's average responses in each of the subject area categories tested was prepared for schools that purchased the comprehensive exam. Comparative percentile scores on more than 50 different subject area categories were included. These subject areas constituted the client need categories, the nursing process categories, and the subcategories of five specialty areas: Medical-Surgical, Maternity, Pediatric, Psychiatric/Mental Health, and Gerontological nursing. The comparative scores, which might be useful for curriculum evaluation, provided a method for schools to compare their students with students throughout the United States who had previously answered the same test items. The PN Score Summary contained one fewer nursing process category, Analysis, which was not tested on the PN exam. The Score Summary was revised for the 1998 RN testing to include the 12 subcategories of the client need categories identified in the National Council's test blueprint, effective April 1998.17

 

CONCLUSIONS AND RECOMMENDATIONS

The E2, a computerized instrument, appears to be highly predictive of students' success on the NCLEX-RN and the NCLEX-PN. However, the small PN sample size compared with the RN sample size, and the lack of geographic diversity among the PN group, suggest that replication of this study with a larger PN sample is indicated.

 

Because the E2 is a computerized instrument, it provides students' individual scores at the completion of the exam, enabling remediation to be initiated immediately if indicated. Further, the summary analysis produced by the E2 enables faculty to compare their students' performance on more than 50 different subject area categories with the performance of students across the United States who answered the same test items previously. These comparisons could be used as a guide for students preparing for the licensing exam and possibly by faculties in evaluating nursing curricula.

 

Students viewed the rationales for items answered incorrectly at the conclusion of the exam. This feedback could have served as a learning experience and guided students in preparing for the licensing exam. Students at risk of failing the NCLEX were encouraged within the E2 summary report to seek varying degrees of additional preparation, and students who sought the recommended remediation probably increased their chances of being successful. Success on the NCLEX examination among students who obtained low scores on the E2 was not studied; the influence of remediation and additional preparation on students' ability to succeed was considered a topic for a separate study. Therefore, further research is indicated to examine low-scoring E2 students' success on the licensing exam, as well as the factors significant to low-scoring students becoming successful first-time takers of the licensing exam. Nevertheless, based on the findings of this study, the E2 was determined to be a highly predictive measure of students' ability to succeed on the NCLEX-RN and NCLEX-PN.

 

REFERENCES

 

1. Billings D, Hodson-Carlton K, Kirkpatrick J, et al. Computerized NCLEX-RN preparation programs: A comparative review. Comput Nurs. 1996;14:272-286.

 

2. Riner ME, Mueller C, Ihrke B, et al. Computerized NCLEX-RN and NCLEX-PN preparation programs: Comparative review. Comput Nurs. 1997;15:255-267.

 

3. Alexander JE Sr, Brophy GG. A five-year study of graduates' performance on NCLEX-RN. J Nurs Educ. 1997;36:443-445.

 

4. Arathuzik D, Aber C. Factors associated with National Council Licensure Examination-Registered Nurse success. J Prof Nurs. 1998;14:119-126.

 

5. Heupel C. A model for intervention and predicting success on the National Council Licensure Examination for Registered Nurses. J Prof Nurs. 1994;10:112-117.

 

6. Wall BM, Miller DE, Widerquist JG. Predictors of success on the newest NCLEX-RN. West J Nurs Res. 1993;15:628-643.

 

7. Waterhouse JK, Bucher L, Beeman PB. Predicting NCLEX-RN performance: Cross-validating an identified classification procedure. J Prof Nurs. 1994;10:255-260.

 

8. Ashley J, O'Neil J. The effectiveness of an intervention to promote successful performance on NCLEX-RN for baccalaureate students at risk for failure. J Nurs Educ. 1994;33:360-366.

 

9. Barkley TW, Rhodes RS, Dufour CA. Predictors of success on the NCLEX-RN. Nurs Health Care Perspect. 1998;19:132-137.

 

10. Fowles ER. Predictors of success on NCLEX-RN and within the nursing curriculum: Implications for early intervention. J Nurs Educ. 1992;31:53-57.

 

11. Ross B, Nice A, May F, Billings D. Assisting students at risk using computer NCLEX-RN review software. Nurse Educ. 1996;21:39-43.

 

12. Jenks J, Selekman J, Bross T, Paquet M. Success in NCLEX-RN: Identifying predictors and optimal timing for intervention. J Nurs Educ. 1989;28:112-118.

 

13. National Council of State Boards of Nursing, Inc. NCLEX-RN examination test plan for the National Council Licensure Examination for Registered Nurses, effective October 1995. Chicago, IL: National Council of State Boards of Nursing, Inc.; 1994.

 

14. National Council of State Boards of Nursing, Inc. The NCLEX process. 1995.

 

15. Bloom BS. Taxonomy of Educational Objectives: The Classification of Educational Goals. Handbook 1: Cognitive Domain. New York, NY: McKay; 1956.

 

16. Morrison S, Smith P, Britt R. Critical Thinking and Test Item Writing. Houston, TX: Health Education Systems, Inc; 1996.

 

17. National Council of State Boards of Nursing, Inc. NCLEX-RN examination test plan for the National Council Licensure Examination for Registered Nurses, effective April 1998. 1997.